51385 1727204580.87482: starting run
ansible-playbook [core 2.17.4]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-G1p
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
No config file found; using defaults
51385 1727204580.88932: Added group all to inventory
51385 1727204580.88934: Added group ungrouped to inventory
51385 1727204580.88939: Group all now contains ungrouped
51385 1727204580.88942: Examining possible inventory source: /tmp/network-M6W/inventory-5vW.yml
51385 1727204581.23428: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache
51385 1727204581.23505: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py
51385 1727204581.23530: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory
51385 1727204581.23616: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py
51385 1727204581.23709: Loaded config def from plugin (inventory/script)
51385 1727204581.23712: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py
51385 1727204581.23754: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py
51385 1727204581.23854: Loaded config def from plugin (inventory/yaml)
51385 1727204581.23857: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py
51385 1727204581.23947: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py
51385 1727204581.25309: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py
51385 1727204581.25313: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py)
51385 1727204581.25317: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py)
51385 1727204581.25324: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py)
51385 1727204581.25329: Loading data from /tmp/network-M6W/inventory-5vW.yml
51385 1727204581.25406: /tmp/network-M6W/inventory-5vW.yml was not parsable by auto
51385 1727204581.25606: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py)
51385 1727204581.25876: Loading data from /tmp/network-M6W/inventory-5vW.yml
51385 1727204581.25986: group all already in inventory
51385 1727204581.25996: set inventory_file for managed-node1
51385 1727204581.26003: set inventory_dir for managed-node1
51385 1727204581.26005: Added host managed-node1 to inventory
51385 1727204581.26009: Added host managed-node1 to group all
51385 1727204581.26010: set ansible_host for managed-node1
51385 1727204581.26011: set ansible_ssh_extra_args for managed-node1
51385 1727204581.26017: set inventory_file for managed-node2
51385 1727204581.26021: set inventory_dir for managed-node2
51385 1727204581.26022: Added host managed-node2 to inventory
51385 1727204581.26026: Added host managed-node2 to group all
51385 1727204581.26027: set ansible_host for managed-node2
51385 1727204581.26028: set ansible_ssh_extra_args for managed-node2
51385 1727204581.26034: set inventory_file for managed-node3
51385 1727204581.26038: set inventory_dir for managed-node3
51385 1727204581.26039: Added host managed-node3 to inventory
51385 1727204581.26040: Added host managed-node3 to group all
51385 1727204581.26041: set ansible_host for managed-node3
51385 1727204581.26042: set ansible_ssh_extra_args for managed-node3
51385 1727204581.26045: Reconcile groups and hosts in inventory.
51385 1727204581.26050: Group ungrouped now contains managed-node1
51385 1727204581.26055: Group ungrouped now contains managed-node2
51385 1727204581.26057: Group ungrouped now contains managed-node3
51385 1727204581.26146: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name
51385 1727204581.26476: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments
51385 1727204581.26523: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py
51385 1727204581.26553: Loaded config def from plugin (vars/host_group_vars)
51385 1727204581.26555: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True)
51385 1727204581.26563: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars
51385 1727204581.26572: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False)
51385 1727204581.26615: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False)
51385 1727204581.27967: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
51385 1727204581.28072: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py
51385 1727204581.28113: Loaded config def from plugin (connection/local)
51385 1727204581.28117: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True)
51385 1727204581.29434: Loaded config def from plugin (connection/paramiko_ssh)
51385 1727204581.29438: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True)
51385 1727204581.31502: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
51385 1727204581.31542: Loaded config def from plugin (connection/psrp)
51385 1727204581.31544: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True)
51385 1727204581.32288: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
51385 1727204581.32327: Loaded config def from plugin (connection/ssh)
51385 1727204581.32330: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True)
51385 1727204581.32676: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
51385 1727204581.32714: Loaded config def from plugin (connection/winrm)
51385 1727204581.32717: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True)
51385 1727204581.32747: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name
51385 1727204581.32811: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py
51385 1727204581.32878: Loaded config def from plugin (shell/cmd)
51385 1727204581.32880: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True)
51385 1727204581.32913: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False)
51385 1727204581.33003: Loaded config def from plugin (shell/powershell)
51385 1727204581.33005: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True)
51385 1727204581.33074: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py
51385 1727204581.33245: Loaded config def from plugin (shell/sh)
51385 1727204581.33247: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True)
51385 1727204581.33284: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name
51385 1727204581.33407: Loaded config def from plugin (become/runas)
51385 1727204581.33409: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True)
51385 1727204581.33895: Loaded config def from plugin (become/su)
51385 1727204581.33897: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True)
51385 1727204581.34061: Loaded config def from plugin (become/sudo)
51385 1727204581.34065: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True)
running playbook inside collection fedora.linux_system_roles
51385 1727204581.34103: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tests_vlan_mtu_nm.yml
51385 1727204581.34844: in VariableManager get_vars()
51385 1727204581.34870: done with get_vars()
51385 1727204581.35125: trying /usr/local/lib/python3.12/site-packages/ansible/modules
51385 1727204581.41223: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action
51385 1727204581.41358: in VariableManager get_vars()
51385 1727204581.41366: done with get_vars()
51385 1727204581.41369: variable 'playbook_dir' from source: magic vars
51385 1727204581.41370: variable 'ansible_playbook_python' from source: magic vars
51385 1727204581.41371: variable 'ansible_config_file' from source: magic vars
51385 1727204581.41372: variable 'groups' from source: magic vars
51385 1727204581.41373: variable 'omit' from source: magic vars
51385 1727204581.41374: variable 'ansible_version' from source: magic vars
51385 1727204581.41375: variable 'ansible_check_mode' from source: magic vars
51385 1727204581.41375: variable 'ansible_diff_mode' from source: magic vars
51385 1727204581.41376: variable 'ansible_forks' from source: magic vars
51385 1727204581.41377: variable 'ansible_inventory_sources' from source: magic vars
51385 1727204581.41377: variable 'ansible_skip_tags' from source: magic vars
51385 1727204581.41378: variable 'ansible_limit' from source: magic vars
51385 1727204581.41379: variable 'ansible_run_tags' from source: magic vars
51385 1727204581.41380: variable 'ansible_verbosity' from source: magic vars
51385 1727204581.41421: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_vlan_mtu.yml
51385 1727204581.41838: in VariableManager get_vars()
51385 1727204581.41855: done with get_vars()
51385 1727204581.41896: in VariableManager get_vars()
51385 1727204581.41911: done with get_vars()
51385 1727204581.41951: in VariableManager get_vars()
51385 1727204581.41975: done with get_vars()
51385 1727204581.42121: in VariableManager get_vars()
51385 1727204581.42135: done with get_vars()
51385 1727204581.42140: variable 'omit' from source: magic vars
51385 1727204581.42164: variable 'omit' from source: magic vars
51385 1727204581.42200: in VariableManager get_vars()
51385 1727204581.42211: done with get_vars()
51385 1727204581.42262: in VariableManager get_vars()
51385 1727204581.42279: done with get_vars()
51385 1727204581.42316: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
51385 1727204581.42685: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
51385 1727204581.42820: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
51385 1727204581.44625: in VariableManager get_vars()
51385 1727204581.44801: done with get_vars()
51385 1727204581.46609: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__
51385 1727204581.46828: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
51385 1727204581.50196: in VariableManager get_vars()
51385 1727204581.50228: done with get_vars()
51385 1727204581.50286: in VariableManager get_vars()
51385 1727204581.50354: done with get_vars()
51385 1727204581.50869: in VariableManager get_vars()
51385 1727204581.50885: done with get_vars()
51385 1727204581.50890: variable 'omit' from source: magic vars
51385 1727204581.50901: variable 'omit' from source: magic vars
51385 1727204581.50957: in VariableManager get_vars()
51385 1727204581.50989: done with get_vars()
51385 1727204581.51041: in VariableManager get_vars()
51385 1727204581.51085: done with get_vars()
51385 1727204581.51130: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
51385 1727204581.51250: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
51385 1727204581.51407: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
51385 1727204581.52418: in VariableManager get_vars()
51385 1727204581.52441: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
51385 1727204581.55649: in VariableManager get_vars()
51385 1727204581.55684: done with get_vars()
51385 1727204581.55748: in VariableManager get_vars()
51385 1727204581.55800: done with get_vars()
51385 1727204581.55892: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback
51385 1727204581.55907: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
51385 1727204581.56287: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py
51385 1727204581.56477: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug)
51385 1727204581.56480: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-G1p/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback)
51385 1727204581.56510: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name
51385 1727204581.56534: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False)
51385 1727204581.56733: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py
51385 1727204581.56797: Loaded config def from plugin (callback/default)
51385 1727204581.56800: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
51385 1727204581.58980: Loaded config def from plugin (callback/junit)
51385 1727204581.58983: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
51385 1727204581.59254: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False)
51385 1727204581.59319: Loaded config def from plugin (callback/minimal)
51385 1727204581.59321: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
51385 1727204581.59476: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
51385 1727204581.59536: Loaded config def from plugin (callback/tree)
51385 1727204581.59538: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks
51385 1727204581.59795: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks)
51385 1727204581.59798: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-G1p/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.

PLAYBOOK: tests_vlan_mtu_nm.yml ************************************************
2 plays in /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tests_vlan_mtu_nm.yml
51385 1727204581.59825: in VariableManager get_vars()
51385 1727204581.59839: done with get_vars()
51385 1727204581.59845: in VariableManager get_vars()
51385 1727204581.59853: done with get_vars()
51385 1727204581.59857: variable 'omit' from source: magic vars
51385 1727204581.59907: in VariableManager get_vars()
51385 1727204581.59923: done with get_vars()
51385 1727204581.59944: variable 'omit' from source: magic vars

PLAY [Run playbook 'playbooks/tests_vlan_mtu.yml' with nm as provider] *********
51385 1727204581.62511: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy
51385 1727204581.62608: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py
51385 1727204581.62639: getting the remaining hosts for this loop
51385 1727204581.62641: done getting the remaining hosts for this loop
51385 1727204581.62645: getting the next task for host managed-node1
51385 1727204581.62649: done getting next task for host managed-node1
51385 1727204581.62651: ^ task is: TASK: Gathering Facts
51385 1727204581.62652: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
51385 1727204581.62655: getting variables
51385 1727204581.62656: in VariableManager get_vars()
51385 1727204581.62669: Calling all_inventory to load vars for managed-node1
51385 1727204581.62673: Calling groups_inventory to load vars for managed-node1
51385 1727204581.62676: Calling all_plugins_inventory to load vars for managed-node1
51385 1727204581.62688: Calling all_plugins_play to load vars for managed-node1
51385 1727204581.62699: Calling groups_plugins_inventory to load vars for managed-node1
51385 1727204581.62703: Calling groups_plugins_play to load vars for managed-node1
51385 1727204581.62739: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
51385 1727204581.62794: done with get_vars()
51385 1727204581.62801: done getting variables
51385 1727204581.62868: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tests_vlan_mtu_nm.yml:6
Tuesday 24 September 2024 15:03:01 -0400 (0:00:00.032) 0:00:00.032 *****
51385 1727204581.62891: entering _queue_task() for managed-node1/gather_facts
51385 1727204581.62893: Creating lock for gather_facts
51385 1727204581.63235: worker is 1 (out of 1 available)
51385 1727204581.63245: exiting _queue_task() for managed-node1/gather_facts
51385 1727204581.63258: done queuing things up, now waiting for results queue to drain
51385 1727204581.63260: waiting for pending results...
51385 1727204581.63512: running TaskExecutor() for managed-node1/TASK: Gathering Facts
51385 1727204581.63643: in run() - task 0affcd87-79f5-6b1f-5706-0000000000af
51385 1727204581.63662: variable 'ansible_search_path' from source: unknown
51385 1727204581.63728: calling self._execute()
51385 1727204581.63809: variable 'ansible_host' from source: host vars for 'managed-node1'
51385 1727204581.63824: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
51385 1727204581.63835: variable 'omit' from source: magic vars
51385 1727204581.63943: variable 'omit' from source: magic vars
51385 1727204581.63983: variable 'omit' from source: magic vars
51385 1727204581.64028: variable 'omit' from source: magic vars
51385 1727204581.64087: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
51385 1727204581.64136: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
51385 1727204581.64166: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
51385 1727204581.64200: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
51385 1727204581.64216: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
51385 1727204581.64256: variable 'inventory_hostname' from source: host vars for 'managed-node1'
51385 1727204581.64267: variable 'ansible_host' from source: host vars for 'managed-node1'
51385 1727204581.64275: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
51385 1727204581.64390: Set connection var ansible_pipelining to False
51385 1727204581.64403: Set connection var ansible_shell_type to sh
51385 1727204581.64420: Set connection var ansible_module_compression to ZIP_DEFLATED
51385 1727204581.64432: Set connection var ansible_timeout to 10
51385 1727204581.64439: Set connection var ansible_connection to ssh
51385 1727204581.64449: Set connection var ansible_shell_executable to /bin/sh
51385 1727204581.64478: variable 'ansible_shell_executable' from source: unknown
51385 1727204581.64489: variable 'ansible_connection' from source: unknown
51385 1727204581.64495: variable 'ansible_module_compression' from source: unknown
51385 1727204581.64502: variable 'ansible_shell_type' from source: unknown
51385 1727204581.64511: variable 'ansible_shell_executable' from source: unknown
51385 1727204581.64523: variable 'ansible_host' from source: host vars for 'managed-node1'
51385 1727204581.64534: variable 'ansible_pipelining' from source: unknown
51385 1727204581.64546: variable 'ansible_timeout' from source: unknown
51385 1727204581.64554: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
51385 1727204581.64795: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
51385 1727204581.64812: variable 'omit' from source: magic vars
51385 1727204581.64820: starting attempt loop
51385 1727204581.64825: running the handler
51385 1727204581.64849: variable 'ansible_facts' from source: unknown
51385 1727204581.64873: _low_level_execute_command(): starting
51385 1727204581.64908: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
51385 1727204581.66373: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
51385 1727204581.66404: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
51385 1727204581.66410: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<<
51385 1727204581.66413: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<<
51385 1727204581.66415: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
51385 1727204581.66474: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
51385 1727204581.66490: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
51385 1727204581.66569: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
51385 1727204581.68303: stdout chunk (state=3): >>>/root <<<
51385 1727204581.68380: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
51385 1727204581.68512: stderr chunk (state=3): >>><<<
51385 1727204581.68526: stdout chunk (state=3): >>><<<
51385 1727204581.68656: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
51385 1727204581.68663: _low_level_execute_command(): starting
51385 1727204581.68669: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204581.6855466-51486-280200456964861 `" && echo ansible-tmp-1727204581.6855466-51486-280200456964861="` echo /root/.ansible/tmp/ansible-tmp-1727204581.6855466-51486-280200456964861 `" ) && sleep 0'
51385 1727204581.69666: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
51385 1727204581.69692: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
51385 1727204581.69710: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
51385 1727204581.69731: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
51385 1727204581.69788: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<<
51385 1727204581.69807: stderr chunk (state=3): >>>debug2: match not found <<<
51385 1727204581.69821: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
51385 1727204581.69846: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
51385 1727204581.69868: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<<
51385 1727204581.69880: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
51385 1727204581.69893: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
51385 1727204581.69908: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
51385 1727204581.69929: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
51385 1727204581.69943: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<<
51385 1727204581.69957: stderr chunk (state=3): >>>debug2: match found <<<
51385 1727204581.69988: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
51385 1727204581.70080: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
51385 1727204581.70109: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
51385 1727204581.70129: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
51385 1727204581.70227: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
51385 1727204581.72088: stdout chunk (state=3): >>>ansible-tmp-1727204581.6855466-51486-280200456964861=/root/.ansible/tmp/ansible-tmp-1727204581.6855466-51486-280200456964861 <<<
51385 1727204581.72291: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
51385 1727204581.72321: stderr chunk (state=3): >>><<<
51385 1727204581.72324: stdout chunk (state=3): >>><<<
51385 1727204581.72420: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204581.6855466-51486-280200456964861=/root/.ansible/tmp/ansible-tmp-1727204581.6855466-51486-280200456964861 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
51385 1727204581.72470: variable 'ansible_module_compression' from source: unknown
51385 1727204581.72874: ANSIBALLZ: Using generic lock for ansible.legacy.setup
51385 1727204581.72877: ANSIBALLZ: Acquiring lock
51385 1727204581.72879: ANSIBALLZ: Lock acquired: 140124837667440
51385 1727204581.72881: ANSIBALLZ: Creating module
51385 1727204582.51543: ANSIBALLZ: Writing module into payload
51385 1727204582.52177: ANSIBALLZ: Writing module
51385 1727204582.52210: ANSIBALLZ: Renaming module
51385 1727204582.52214: ANSIBALLZ: Done creating module
51385 1727204582.52392: variable 'ansible_facts' from source: unknown
51385 1727204582.52399: variable 'inventory_hostname' from source: host vars for 'managed-node1'
51385 1727204582.52410: _low_level_execute_command(): starting
51385 1727204582.52416: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0'
51385 1727204582.53609: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
51385 1727204582.53628: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
51385 1727204582.53639: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
51385 1727204582.53654: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
51385 1727204582.53706: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<<
51385 1727204582.53713: stderr chunk (state=3): >>>debug2: match not found <<<
51385 1727204582.53724: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
51385 1727204582.53736: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
51385 1727204582.53743: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<<
51385 1727204582.53750: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
51385 1727204582.53758: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
51385 1727204582.53773: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
51385 1727204582.53799: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
51385 1727204582.53818: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<<
51385 1727204582.53839: stderr chunk (state=3): >>>debug2: match found <<<
51385 1727204582.53863: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
51385
1727204582.54036: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204582.54075: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204582.54346: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204582.54445: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204582.56372: stdout chunk (state=3): >>>PLATFORM Linux FOUND /usr/bin/python3.9 /usr/bin/python3 /usr/bin/python3 ENDFOUND <<< 51385 1727204582.56541: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204582.56545: stdout chunk (state=3): >>><<< 51385 1727204582.56547: stderr chunk (state=3): >>><<< 51385 1727204582.56711: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.9 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 
debug2: Received exit status from master 0 51385 1727204582.56723 [managed-node1]: found interpreters: ['/usr/bin/python3.9', '/usr/bin/python3', '/usr/bin/python3'] 51385 1727204582.56726: _low_level_execute_command(): starting 51385 1727204582.56729: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 && sleep 0' 51385 1727204582.56848: Sending initial data 51385 1727204582.56852: Sent initial data (1181 bytes) 51385 1727204582.57371: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204582.57381: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204582.57392: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204582.57405: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204582.57440: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204582.57448: stderr chunk (state=3): >>>debug2: match not found <<< 51385 1727204582.57457: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204582.57476: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 51385 1727204582.57482: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 51385 1727204582.57488: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 51385 1727204582.57495: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204582.57504: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204582.57516: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204582.57522: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 
10.31.9.148 <<< 51385 1727204582.57528: stderr chunk (state=3): >>>debug2: match found <<< 51385 1727204582.57537: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204582.57605: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204582.57622: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204582.57634: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204582.57721: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204582.62472: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"9\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"9\"\nPLATFORM_ID=\"platform:el9\"\nPRETTY_NAME=\"CentOS Stream 9\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:9\"\nHOME_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 9\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} <<< 51385 1727204582.62985: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204582.62989: stdout chunk (state=3): >>><<< 51385 1727204582.62994: stderr chunk (state=3): >>><<< 51385 1727204582.63007: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"9\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"9\"\nPLATFORM_ID=\"platform:el9\"\nPRETTY_NAME=\"CentOS Stream 9\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:9\"\nHOME_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 9\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204582.63080: variable 'ansible_facts' from source: unknown 51385 1727204582.63084: variable 'ansible_facts' from source: unknown 51385 1727204582.63096: variable 'ansible_module_compression' from source: unknown 51385 1727204582.63135: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-51385tpxlmlox/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 51385 1727204582.63169: variable 'ansible_facts' from source: unknown 51385 1727204582.63313: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204581.6855466-51486-280200456964861/AnsiballZ_setup.py 51385 1727204582.63472: Sending initial data 51385 1727204582.63476: Sent initial data (154 bytes) 51385 1727204582.65477: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204582.65873: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204582.65884: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204582.65898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204582.65942: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204582.65948: stderr chunk (state=3): >>>debug2: match not found <<< 51385 1727204582.65958: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204582.65976: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 51385 1727204582.65984: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 51385 1727204582.65990: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 51385 1727204582.65998: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204582.66006: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204582.66019: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204582.66028: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204582.66031: stderr chunk (state=3): >>>debug2: match found <<< 51385 1727204582.66040: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204582.66117: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204582.66137: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204582.66147: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204582.66231: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204582.67934: stderr chunk 
(state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 51385 1727204582.67985: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 51385 1727204582.68116: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-51385tpxlmlox/tmpzkted33q /root/.ansible/tmp/ansible-tmp-1727204581.6855466-51486-280200456964861/AnsiballZ_setup.py <<< 51385 1727204582.68176: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 51385 1727204582.71747: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204582.71752: stderr chunk (state=3): >>><<< 51385 1727204582.71755: stdout chunk (state=3): >>><<< 51385 1727204582.71757: done transferring module to remote 51385 1727204582.71759: _low_level_execute_command(): starting 51385 1727204582.71761: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204581.6855466-51486-280200456964861/ /root/.ansible/tmp/ansible-tmp-1727204581.6855466-51486-280200456964861/AnsiballZ_setup.py && sleep 0' 51385 1727204582.72658: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204582.72663: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204582.72704: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204582.72711: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204582.72726: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204582.72730: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204582.72809: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204582.72813: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204582.72828: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204582.72912: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204582.75217: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204582.75221: stdout chunk (state=3): >>><<< 51385 1727204582.75223: stderr chunk (state=3): >>><<< 51385 1727204582.75319: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204582.75323: _low_level_execute_command(): starting 51385 1727204582.75325: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204581.6855466-51486-280200456964861/AnsiballZ_setup.py && sleep 0' 51385 1727204582.76863: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204582.76941: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204582.76951: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204582.76970: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204582.77021: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204582.77028: stderr chunk (state=3): >>>debug2: match not found <<< 51385 1727204582.77039: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204582.77056: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 51385 1727204582.77068: stderr chunk (state=3): 
>>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 51385 1727204582.77074: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 51385 1727204582.77082: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204582.77092: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204582.77111: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204582.77122: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204582.77128: stderr chunk (state=3): >>>debug2: match found <<< 51385 1727204582.77138: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204582.77226: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204582.77245: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204582.77257: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204582.77360: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204582.80098: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 51385 1727204582.80102: stdout chunk (state=3): >>>import _imp # builtin <<< 51385 1727204582.80151: stdout chunk (state=3): >>>import '_thread' # <<< 51385 1727204582.80170: stdout chunk (state=3): >>>import '_warnings' # <<< 51385 1727204582.80175: stdout chunk (state=3): >>> <<< 51385 1727204582.80178: stdout chunk (state=3): >>>import '_weakref' # <<< 51385 1727204582.80287: stdout chunk (state=3): >>>import '_io' # <<< 51385 1727204582.80291: stdout chunk (state=3): >>>import 'marshal' # <<< 51385 1727204582.80356: stdout chunk (state=3): >>>import 'posix' # <<< 51385 1727204582.80365: stdout chunk (state=3): >>> <<< 51385 1727204582.80408: stdout 
chunk (state=3): >>>import '_frozen_importlib_external' # <<< 51385 1727204582.80423: stdout chunk (state=3): >>> <<< 51385 1727204582.80426: stdout chunk (state=3): >>># installing zipimport hook <<< 51385 1727204582.80486: stdout chunk (state=3): >>>import 'time' # <<< 51385 1727204582.80521: stdout chunk (state=3): >>>import 'zipimport' # <<< 51385 1727204582.80529: stdout chunk (state=3): >>> <<< 51385 1727204582.80532: stdout chunk (state=3): >>># installed zipimport hook <<< 51385 1727204582.80609: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py<<< 51385 1727204582.80625: stdout chunk (state=3): >>> <<< 51385 1727204582.80628: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' <<< 51385 1727204582.80674: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py<<< 51385 1727204582.80679: stdout chunk (state=3): >>> <<< 51385 1727204582.80708: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' <<< 51385 1727204582.80735: stdout chunk (state=3): >>>import '_codecs' # <<< 51385 1727204582.80740: stdout chunk (state=3): >>> <<< 51385 1727204582.80783: stdout chunk (state=3): >>>import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3709373dc0> <<< 51385 1727204582.80839: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py <<< 51385 1727204582.80888: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc'<<< 51385 1727204582.80896: stdout chunk (state=3): >>> <<< 51385 1727204582.80912: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f37093183a0> <<< 
51385 1727204582.80915: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3709373b20> <<< 51385 1727204582.80970: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py <<< 51385 1727204582.80976: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' <<< 51385 1727204582.81015: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3709373ac0> <<< 51385 1727204582.81052: stdout chunk (state=3): >>>import '_signal' # <<< 51385 1727204582.81108: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py <<< 51385 1727204582.81113: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' <<< 51385 1727204582.81154: stdout chunk (state=3): >>>import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3709318490> <<< 51385 1727204582.81196: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py <<< 51385 1727204582.81200: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' <<< 51385 1727204582.81242: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py <<< 51385 1727204582.81265: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' <<< 51385 1727204582.81314: stdout chunk (state=3): >>>import '_abc' # <<< 51385 1727204582.81318: stdout chunk (state=3): >>>import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3709318940> <<< 51385 1727204582.81365: stdout chunk (state=3): >>>import 'io' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f3709318670> <<< 51385 1727204582.81432: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py <<< 51385 1727204582.81435: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' <<< 51385 1727204582.81490: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py <<< 51385 1727204582.81523: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' <<< 51385 1727204582.81572: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py <<< 51385 1727204582.81644: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' <<< 51385 1727204582.81647: stdout chunk (state=3): >>>import '_stat' # <<< 51385 1727204582.81650: stdout chunk (state=3): >>>import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f37092cf190> <<< 51385 1727204582.81685: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py <<< 51385 1727204582.81728: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' <<< 51385 1727204582.81844: stdout chunk (state=3): >>>import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f37092cf220> <<< 51385 1727204582.81886: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py <<< 51385 1727204582.81916: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc'<<< 51385 1727204582.81919: stdout chunk (state=3): >>> <<< 51385 1727204582.81988: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py <<< 51385 1727204582.82004: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' <<< 51385 1727204582.82008: stdout chunk (state=3): >>>import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f37092f2850> <<< 51385 1727204582.82013: stdout chunk (state=3): >>>import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f37092cf940> <<< 51385 1727204582.82068: stdout chunk (state=3): >>>import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3709330880> <<< 51385 1727204582.82112: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py <<< 51385 1727204582.82120: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' <<< 51385 1727204582.82145: stdout chunk (state=3): >>>import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f37092c8d90><<< 51385 1727204582.82150: stdout chunk (state=3): >>> <<< 51385 1727204582.82236: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py <<< 51385 1727204582.82239: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' <<< 51385 1727204582.82296: stdout chunk (state=3): >>>import '_locale' # <<< 51385 1727204582.82299: stdout chunk (state=3): >>>import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f37092f2d90> <<< 51385 1727204582.82392: stdout chunk (state=3): >>>import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3709318970> <<< 51385 1727204582.82456: stdout chunk (state=3): >>>Python 3.9.19 (main, Aug 23 2024, 00:00:00) <<< 51385 1727204582.82462: stdout chunk (state=3): 
>>> [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux <<< 51385 1727204582.82467: stdout chunk (state=3): >>>Type "help", "copyright", "credits" or "license" for more information. <<< 51385 1727204582.83016: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py <<< 51385 1727204582.83058: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' <<< 51385 1727204582.83092: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py <<< 51385 1727204582.83116: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc'<<< 51385 1727204582.83120: stdout chunk (state=3): >>> <<< 51385 1727204582.83158: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py <<< 51385 1727204582.83192: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' <<< 51385 1727204582.83231: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py <<< 51385 1727204582.83291: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' <<< 51385 1727204582.83294: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f370926df10> <<< 51385 1727204582.83402: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f37092740a0><<< 51385 1727204582.83405: stdout chunk (state=3): >>> <<< 51385 1727204582.83407: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py <<< 51385 1727204582.83470: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' <<< 51385 
1727204582.83473: stdout chunk (state=3): >>>import '_sre' # <<< 51385 1727204582.83604: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py <<< 51385 1727204582.83607: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' <<< 51385 1727204582.83609: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py <<< 51385 1727204582.83611: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc'<<< 51385 1727204582.83613: stdout chunk (state=3): >>> <<< 51385 1727204582.83678: stdout chunk (state=3): >>>import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f37092675b0><<< 51385 1727204582.83682: stdout chunk (state=3): >>> <<< 51385 1727204582.83684: stdout chunk (state=3): >>>import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f370926e6a0> <<< 51385 1727204582.83743: stdout chunk (state=3): >>>import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f370926d3d0> <<< 51385 1727204582.83746: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py<<< 51385 1727204582.83855: stdout chunk (state=3): >>> <<< 51385 1727204582.83859: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc'<<< 51385 1727204582.83861: stdout chunk (state=3): >>> <<< 51385 1727204582.83942: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py<<< 51385 1727204582.83946: stdout chunk (state=3): >>> <<< 51385 1727204582.83948: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc'<<< 
51385 1727204582.83950: stdout chunk (state=3): >>> <<< 51385 1727204582.84001: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py<<< 51385 1727204582.84005: stdout chunk (state=3): >>> <<< 51385 1727204582.84008: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc'<<< 51385 1727204582.84010: stdout chunk (state=3): >>> <<< 51385 1727204582.84065: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so'<<< 51385 1727204582.84102: stdout chunk (state=3): >>> <<< 51385 1727204582.84106: stdout chunk (state=3): >>># extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so'<<< 51385 1727204582.84109: stdout chunk (state=3): >>> import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3708f4edf0><<< 51385 1727204582.84111: stdout chunk (state=3): >>> <<< 51385 1727204582.84113: stdout chunk (state=3): >>>import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3708f4e8e0><<< 51385 1727204582.84115: stdout chunk (state=3): >>> <<< 51385 1727204582.84173: stdout chunk (state=3): >>>import 'itertools' # <<< 51385 1727204582.84191: stdout chunk (state=3): >>> # /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py<<< 51385 1727204582.84210: stdout chunk (state=3): >>> <<< 51385 1727204582.84214: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' <<< 51385 1727204582.84216: stdout chunk (state=3): >>>import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3708f4eee0> <<< 51385 1727204582.84260: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py <<< 51385 1727204582.84290: stdout chunk 
(state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' <<< 51385 1727204582.84323: stdout chunk (state=3): >>>import '_operator' # <<< 51385 1727204582.84340: stdout chunk (state=3): >>> <<< 51385 1727204582.84345: stdout chunk (state=3): >>>import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3708f4efa0> <<< 51385 1727204582.84387: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py <<< 51385 1727204582.84419: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' <<< 51385 1727204582.84428: stdout chunk (state=3): >>>import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3708f4eeb0> <<< 51385 1727204582.84460: stdout chunk (state=3): >>>import '_collections' # <<< 51385 1727204582.84556: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3709249d60> <<< 51385 1727204582.84559: stdout chunk (state=3): >>>import '_functools' # <<< 51385 1727204582.84562: stdout chunk (state=3): >>> <<< 51385 1727204582.84804: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3709242640> <<< 51385 1727204582.84807: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py <<< 51385 1727204582.84809: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' <<< 51385 1727204582.84811: stdout chunk (state=3): >>>import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f37092566a0> <<< 51385 1727204582.84813: stdout chunk (state=3): >>>import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3709275e80> <<< 51385 1727204582.84815: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches 
/usr/lib64/python3.9/struct.py <<< 51385 1727204582.84817: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' <<< 51385 1727204582.84857: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' <<< 51385 1727204582.84878: stdout chunk (state=3): >>># extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' <<< 51385 1727204582.84882: stdout chunk (state=3): >>>import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3708f5fca0> <<< 51385 1727204582.84909: stdout chunk (state=3): >>>import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3709249280> <<< 51385 1727204582.84954: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' <<< 51385 1727204582.84992: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' <<< 51385 1727204582.85027: stdout chunk (state=3): >>>import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f37092562b0> <<< 51385 1727204582.85029: stdout chunk (state=3): >>>import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f370927ba30><<< 51385 1727204582.85043: stdout chunk (state=3): >>> <<< 51385 1727204582.85055: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py<<< 51385 1727204582.85072: stdout chunk (state=3): >>> <<< 51385 1727204582.85081: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc'<<< 51385 1727204582.85086: stdout chunk (state=3): >>> <<< 51385 1727204582.85118: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py <<< 51385 1727204582.85140: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' <<< 51385 1727204582.85150: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py <<< 51385 1727204582.85163: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' <<< 51385 1727204582.85167: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3708f5ffd0> <<< 51385 1727204582.85172: stdout chunk (state=3): >>>import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3708f5fdc0> <<< 51385 1727204582.85369: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3708f5fd30> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py # code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' <<< 51385 1727204582.85385: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from 
'/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3708f323a0> # /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py <<< 51385 1727204582.85393: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' <<< 51385 1727204582.85440: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3708f32490> <<< 51385 1727204582.85621: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3708f67fd0> <<< 51385 1727204582.85678: stdout chunk (state=3): >>>import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3708f61a60> <<< 51385 1727204582.85688: stdout chunk (state=3): >>>import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3708f61580> <<< 51385 1727204582.85710: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py <<< 51385 1727204582.85716: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' <<< 51385 1727204582.85770: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py <<< 51385 1727204582.85795: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' <<< 51385 1727204582.85804: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py <<< 51385 1727204582.85807: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3708e661f0> <<< 51385 1727204582.85862: stdout chunk 
(state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3708f1db80> <<< 51385 1727204582.85932: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3708f61ee0> <<< 51385 1727204582.85937: stdout chunk (state=3): >>>import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f370927b0a0> <<< 51385 1727204582.85960: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py <<< 51385 1727204582.86003: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' <<< 51385 1727204582.86008: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py <<< 51385 1727204582.86020: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3708e78b20> <<< 51385 1727204582.86026: stdout chunk (state=3): >>>import 'errno' # <<< 51385 1727204582.86083: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3708e78e50> <<< 51385 1727204582.86086: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py <<< 51385 1727204582.86091: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' <<< 51385 1727204582.86133: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py <<< 51385 1727204582.86136: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' <<< 51385 1727204582.86140: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3708e8a760> <<< 51385 1727204582.86171: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py <<< 51385 1727204582.86209: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' <<< 51385 1727204582.86238: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3708e8aca0> <<< 51385 1727204582.86296: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3708e183d0> <<< 51385 1727204582.86299: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3708e78f40> <<< 51385 1727204582.86324: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py <<< 51385 1727204582.86327: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' <<< 51385 1727204582.86390: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3708e282b0> <<< 51385 1727204582.86393: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3708e8a5e0> import 'pwd' # <<< 51385 1727204582.86431: stdout 
chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' <<< 51385 1727204582.86434: stdout chunk (state=3): >>># extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3708e28370> <<< 51385 1727204582.86478: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3708f5fa00> <<< 51385 1727204582.86498: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py <<< 51385 1727204582.86513: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' <<< 51385 1727204582.86541: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py <<< 51385 1727204582.86553: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' <<< 51385 1727204582.86591: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3708e446d0> <<< 51385 1727204582.86617: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' <<< 51385 1727204582.86655: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f3708e449a0> <<< 51385 1727204582.86658: stdout chunk (state=3): >>>import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3708e44790> <<< 51385 1727204582.86692: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' <<< 51385 1727204582.86695: stdout chunk (state=3): >>># extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3708e44880> <<< 51385 1727204582.86731: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py <<< 51385 1727204582.86734: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' <<< 51385 1727204582.87010: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3708e44cd0> <<< 51385 1727204582.87036: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' <<< 51385 1727204582.87040: stdout chunk (state=3): >>># extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3708e50220> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3708e44910> <<< 51385 1727204582.87069: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3708e37a60> <<< 
51385 1727204582.87093: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3708f5f5e0> <<< 51385 1727204582.87121: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py <<< 51385 1727204582.87201: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' <<< 51385 1727204582.87246: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3708e44ac0> <<< 51385 1727204582.87447: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/cp437.pyc' <<< 51385 1727204582.87478: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f3708d6d6a0> <<< 51385 1727204582.87896: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip' <<< 51385 1727204582.87903: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204582.88040: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204582.88090: stdout chunk (state=3): >>>import ansible # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/__init__.py # zipimport: zlib available <<< 51385 1727204582.88106: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204582.88112: stdout chunk (state=3): >>>import ansible.module_utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/__init__.py <<< 51385 1727204582.88135: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204582.90084: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204582.91632: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f37087657f0> <<< 51385 1727204582.91637: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' <<< 51385 1727204582.91689: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py <<< 51385 1727204582.91718: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' <<< 51385 1727204582.91722: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' <<< 51385 1727204582.91725: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3708765160> <<< 51385 1727204582.91779: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3708765280> <<< 51385 1727204582.91833: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3708765f40> <<< 51385 1727204582.91836: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py <<< 51385 1727204582.91850: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' <<< 51385 1727204582.91900: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f37087654f0> <<< 51385 1727204582.91906: stdout chunk (state=3): >>>import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3708765d60> import 'atexit' # <<< 51385 1727204582.91940: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3708765fa0> <<< 51385 1727204582.91969: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py <<< 51385 1727204582.91998: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' <<< 51385 1727204582.92084: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3708765100> <<< 51385 1727204582.92094: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py <<< 51385 1727204582.92120: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' <<< 51385 1727204582.92124: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py <<< 51385 1727204582.92144: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' <<< 51385 1727204582.92173: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py <<< 51385 1727204582.92176: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' <<< 51385 1727204582.92306: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3708724f10> <<< 51385 1727204582.92346: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' <<< 51385 1727204582.92371: stdout chunk (state=3): >>># extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f370863f310> <<< 51385 1727204582.92412: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f370863f2e0> <<< 51385 1727204582.92416: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py <<< 51385 1727204582.92418: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' <<< 51385 1727204582.92482: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f370863fc70> <<< 51385 1727204582.92486: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f370874ddc0> <<< 51385 1727204582.92788: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f370874d3a0> <<< 51385 1727204582.92801: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py <<< 51385 1727204582.92805: stdout chunk (state=3): >>># code 
object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' <<< 51385 1727204582.92807: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f370874dfa0> <<< 51385 1727204582.92839: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' <<< 51385 1727204582.92889: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' <<< 51385 1727204582.92922: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py <<< 51385 1727204582.92925: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc'<<< 51385 1727204582.92928: stdout chunk (state=3): >>> <<< 51385 1727204582.92960: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py <<< 51385 1727204582.92963: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f370879ac70> <<< 51385 1727204582.93074: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f370876cd00> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f370876c3d0> <<< 51385 1727204582.93080: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3708718b50> <<< 51385 1727204582.93098: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' <<< 51385 1727204582.93129: stdout chunk (state=3): 
>>># extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f370876c4f0> <<< 51385 1727204582.93145: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py <<< 51385 1727204582.93148: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f370876c520> <<< 51385 1727204582.93200: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py <<< 51385 1727204582.93210: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' <<< 51385 1727204582.93213: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py <<< 51385 1727204582.93262: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' <<< 51385 1727204582.93341: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' <<< 51385 1727204582.93376: stdout chunk (state=3): >>># extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f370869d310> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f37087ac220> <<< 51385 1727204582.93380: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py <<< 51385 
1727204582.93382: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' <<< 51385 1727204582.93451: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' <<< 51385 1727204582.93454: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f37086ab880> <<< 51385 1727204582.93457: stdout chunk (state=3): >>>import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f37087ac3a0> <<< 51385 1727204582.93478: stdout chunk (state=3): >>># /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py <<< 51385 1727204582.93536: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' <<< 51385 1727204582.93549: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py <<< 51385 1727204582.93562: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' <<< 51385 1727204582.93566: stdout chunk (state=3): >>>import '_string' # <<< 51385 1727204582.93658: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f37087acca0> <<< 51385 1727204582.93874: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f37086ab820> <<< 51385 1727204582.93988: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f3708745af0> <<< 51385 1727204582.94017: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f37087ac940> <<< 51385 1727204582.94077: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f37087ac5b0> <<< 51385 1727204582.94083: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f37087a58e0> <<< 51385 1727204582.94110: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' <<< 51385 1727204582.94138: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py <<< 51385 1727204582.94159: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' <<< 51385 1727204582.94219: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' <<< 51385 1727204582.94222: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' 
# <_frozen_importlib_external.ExtensionFileLoader object at 0x7f370869f970> <<< 51385 1727204582.94518: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' <<< 51385 1727204582.94522: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f37086bcd60> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f37086a95e0> <<< 51385 1727204582.94577: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so'<<< 51385 1727204582.94591: stdout chunk (state=3): >>> import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f370869ff10> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f37086a99d0> # zipimport: zlib available # zipimport: zlib available <<< 51385 1727204582.94595: stdout chunk (state=3): >>>import ansible.module_utils.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/__init__.py <<< 51385 1727204582.94613: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204582.94723: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204582.94833: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204582.94837: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/__init__.py <<< 51385 
1727204582.94877: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204582.94881: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204582.94900: stdout chunk (state=3): >>>import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/__init__.py <<< 51385 1727204582.94903: stdout chunk (state=3): >>># zipimport: zlib available<<< 51385 1727204582.94905: stdout chunk (state=3): >>> <<< 51385 1727204582.95053: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204582.95201: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204582.95930: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204582.96711: stdout chunk (state=3): >>>import ansible.module_utils.six # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # <<< 51385 1727204582.96742: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/converters.py <<< 51385 1727204582.96746: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py <<< 51385 1727204582.96749: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' <<< 51385 1727204582.96794: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import 
'_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f37086e57f0> <<< 51385 1727204582.96892: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py <<< 51385 1727204582.96895: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f37086ea880> <<< 51385 1727204582.96898: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f37082519a0> <<< 51385 1727204582.96978: stdout chunk (state=3): >>>import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/selinux.py <<< 51385 1727204582.96982: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204582.96984: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204582.97022: stdout chunk (state=3): >>>import ansible.module_utils._text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/_text.py <<< 51385 1727204582.97033: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204582.97213: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204582.97403: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' <<< 51385 1727204582.97438: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3708723730> <<< 51385 1727204582.97453: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204582.98070: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204582.98661: 
stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204582.98740: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204582.98827: stdout chunk (state=3): >>>import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/collections.py <<< 51385 1727204582.98832: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204582.98887: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204582.98927: stdout chunk (state=3): >>>import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/warnings.py <<< 51385 1727204582.98932: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204582.99014: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204582.99142: stdout chunk (state=3): >>>import ansible.module_utils.errors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/errors.py <<< 51385 1727204582.99160: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 51385 1727204582.99167: stdout chunk (state=3): >>>import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/__init__.py <<< 51385 1727204582.99172: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204582.99222: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204582.99283: stdout chunk (state=3): >>>import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/convert_bool.py <<< 51385 
1727204582.99286: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204582.99568: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204582.99873: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py <<< 51385 1727204582.99903: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' <<< 51385 1727204582.99909: stdout chunk (state=3): >>>import '_ast' # <<< 51385 1727204583.00010: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f37087683a0> <<< 51385 1727204583.00016: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204583.00108: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204583.00203: stdout chunk (state=3): >>>import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/formatters.py <<< 51385 1727204583.00207: stdout chunk (state=3): >>>import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/parameters.py <<< 51385 1727204583.00210: stdout chunk (state=3): >>>import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/arg_spec.py <<< 51385 1727204583.00235: stdout chunk (state=3): >>># zipimport: zlib available<<< 51385 1727204583.00239: stdout chunk (state=3): >>> <<< 51385 1727204583.00290: stdout chunk (state=3): >>># zipimport: zlib available <<< 
51385 1727204583.00337: stdout chunk (state=3): >>>import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/locale.py <<< 51385 1727204583.00342: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204583.00399: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204583.00452: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204583.00577: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204583.00659: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py <<< 51385 1727204583.00698: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' <<< 51385 1727204583.00807: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' <<< 51385 1727204583.00816: stdout chunk (state=3): >>># extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f37086dca00> <<< 51385 1727204583.00903: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f37080cd4c0> <<< 51385 1727204583.00949: stdout chunk (state=3): >>>import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/process.py <<< 51385 
1727204583.00956: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204583.01035: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204583.01114: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204583.01141: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204583.01195: stdout chunk (state=3): >>># /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py <<< 51385 1727204583.01201: stdout chunk (state=3): >>># code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' <<< 51385 1727204583.01234: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py <<< 51385 1727204583.01279: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' <<< 51385 1727204583.01305: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py <<< 51385 1727204583.01329: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' <<< 51385 1727204583.01465: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f37086ed6a0> <<< 51385 1727204583.01505: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3708737e50> <<< 51385 1727204583.01589: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3708768ca0> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/distro/__init__.py<<< 51385 1727204583.01592: stdout chunk (state=3): >>> # zipimport: zlib available <<< 51385 1727204583.01618: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 51385 1727204583.01655: stdout chunk (state=3): >>>import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/sys_info.py <<< 51385 1727204583.02363: stdout chunk (state=3): >>>import ansible.module_utils.basic # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/basic.py # zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/modules/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/namespace.py # zipimport: zlib available # zipimport: zlib available <<< 51385 1727204583.02409: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204583.02449: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204583.02498: stdout chunk (state=3): >>>import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/typing.py <<< 51385 1727204583.02515: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204583.02742: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 
1727204583.02978: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204583.03024: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204583.03095: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py <<< 51385 1727204583.03099: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' <<< 51385 1727204583.03142: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py <<< 51385 1727204583.03170: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' <<< 51385 1727204583.03201: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py <<< 51385 1727204583.03211: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' <<< 51385 1727204583.03270: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f370827e940> <<< 51385 1727204583.03314: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py <<< 51385 1727204583.03318: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' <<< 51385 1727204583.03341: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py <<< 51385 1727204583.03412: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' <<< 51385 1727204583.03450: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py <<< 51385 1727204583.03471: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' <<< 51385 1727204583.03497: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3708231a30> <<< 51385 1727204583.03602: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' <<< 51385 1727204583.03606: stdout chunk (state=3): >>># extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f37082319a0> <<< 51385 1727204583.03662: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3708266040> <<< 51385 1727204583.03725: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f370827eaf0> <<< 51385 1727204583.03729: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3707fd2fa0> <<< 51385 1727204583.03756: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3707fd2be0> <<< 51385 1727204583.03785: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py <<< 51385 1727204583.03842: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' <<< 51385 1727204583.03846: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py <<< 51385 1727204583.03871: stdout chunk (state=3): >>># code object 
from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' <<< 51385 1727204583.03945: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3708748d00> <<< 51385 1727204583.03981: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3708216e80><<< 51385 1727204583.03998: stdout chunk (state=3): >>> <<< 51385 1727204583.04056: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py <<< 51385 1727204583.04319: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f37087480d0> # /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py <<< 51385 1727204583.04330: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f370803afd0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3708262e50> <<< 51385 1727204583.04366: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3707fd2e50> <<< 51385 
1727204583.04394: stdout chunk (state=3): >>>import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/timeout.py <<< 51385 1727204583.04452: stdout chunk (state=3): >>>import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/collector.py <<< 51385 1727204583.04457: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204583.04521: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/__init__.py # zipimport: zlib available <<< 51385 1727204583.04670: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204583.04812: stdout chunk (state=3): >>>import ansible.module_utils.facts.other.facter # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/facter.py # zipimport: zlib available <<< 51385 1727204583.04950: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204583.04961: stdout chunk (state=3): >>>import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/ohai.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/__init__.py <<< 51385 1727204583.05017: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204583.05041: stdout chunk (state=3): >>># zipimport: zlib available <<< 
51385 1727204583.05423: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/apparmor.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.caps # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/caps.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.chroot # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/chroot.py # zipimport: zlib available # zipimport: zlib available <<< 51385 1727204583.05435: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204583.05491: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204583.05601: stdout chunk (state=3): >>>import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/utils.py import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/cmdline.py <<< 51385 1727204583.05611: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204583.06050: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204583.06296: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/distribution.py # zipimport: zlib available <<< 51385 1727204583.06338: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 
1727204583.06388: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204583.06414: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204583.06454: stdout chunk (state=3): >>>import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/datetime.py import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/date_time.py <<< 51385 1727204583.06478: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204583.06492: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204583.06524: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/env.py <<< 51385 1727204583.06527: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204583.06578: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204583.06629: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/dns.py <<< 51385 1727204583.06645: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204583.06660: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204583.06691: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/fips.py # zipimport: zlib available <<< 51385 1727204583.06720: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 
1727204583.06749: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/loadavg.py <<< 51385 1727204583.06764: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204583.06816: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204583.06917: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py # code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' <<< 51385 1727204583.06940: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3707f53e50> # /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py <<< 51385 1727204583.06978: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' <<< 51385 1727204583.07586: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3707f539d0> import ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/local.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/lsb.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py # zipimport: zlib available <<< 51385 1727204583.07650: stdout chunk (state=3): >>># zipimport: zlib available <<< 
51385 1727204583.07734: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/platform.py <<< 51385 1727204583.07737: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204583.07791: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204583.07841: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py <<< 51385 1727204583.07878: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' <<< 51385 1727204583.08082: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3707f4a880> <<< 51385 1727204583.08573: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f37082567f0> import ansible.module_utils.facts.system.python # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/python.py <<< 51385 1727204583.08588: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204583.08658: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204583.08716: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/selinux.py <<< 51385 1727204583.08733: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204583.08835: stdout chunk (state=3): >>># zipimport: zlib available <<< 
51385 1727204583.08948: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204583.09099: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204583.09315: stdout chunk (state=3): >>>import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/version.py import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py <<< 51385 1727204583.09334: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204583.09370: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204583.09494: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py # zipimport: zlib available # zipimport: zlib available <<< 51385 1727204583.10691: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py # code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3707f0b310> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3707f0b340> import ansible.module_utils.facts.system.user # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/user.py # zipimport: zlib available # zipimport: 
zlib available import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/base.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/aix.py # zipimport: zlib available <<< 51385 1727204583.10697: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/sysctl.py import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py # zipimport: zlib available <<< 51385 1727204583.10702: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 51385 1727204583.10705: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204583.10722: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py # 
zipimport: zlib available <<< 51385 1727204583.10825: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204583.10936: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py <<< 51385 1727204583.10939: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204583.10974: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204583.10997: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204583.11445: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204583.12074: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/linux.py import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py # zipimport: zlib available # zipimport: zlib available <<< 51385 1727204583.12194: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py <<< 51385 1727204583.12201: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204583.12340: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204583.12469: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py <<< 51385 1727204583.12472: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 
1727204583.12856: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204583.12876: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py <<< 51385 1727204583.12891: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 51385 1727204583.12905: stdout chunk (state=3): >>>import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/__init__.py <<< 51385 1727204583.12913: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204583.12974: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204583.13021: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/base.py <<< 51385 1727204583.13037: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204583.13491: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 51385 1727204583.13881: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py import ansible.module_utils.facts.network.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/aix.py <<< 51385 1727204583.13888: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204583.13941: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 
1727204583.13989: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/darwin.py <<< 51385 1727204583.14008: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204583.14033: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204583.14081: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py <<< 51385 1727204583.14084: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204583.14187: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204583.14288: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py <<< 51385 1727204583.14298: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204583.14328: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204583.14369: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/freebsd.py <<< 51385 1727204583.14391: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204583.14844: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hpux.py # zipimport: zlib available # zipimport: zlib available import 
ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hurd.py # zipimport: zlib available # zipimport: zlib available <<< 51385 1727204583.15056: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/linux.py # zipimport: zlib available <<< 51385 1727204583.15111: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204583.15171: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/iscsi.py <<< 51385 1727204583.15181: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204583.15196: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204583.15235: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/nvme.py # zipimport: zlib available <<< 51385 1727204583.15278: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204583.15294: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/netbsd.py <<< 51385 1727204583.15308: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204583.15334: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204583.15372: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.openbsd # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/openbsd.py # zipimport: zlib available <<< 51385 1727204583.15449: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204583.15538: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/sunos.py <<< 51385 1727204583.15550: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py # zipimport: zlib available <<< 51385 1727204583.15595: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204583.15631: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/base.py <<< 51385 1727204583.15644: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204583.15669: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204583.15684: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204583.15720: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204583.15767: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204583.15824: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204583.15896: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py import ansible.module_utils.facts.virtual.freebsd # loaded 
from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py <<< 51385 1727204583.15903: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204583.15954: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204583.15998: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py <<< 51385 1727204583.16004: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204583.16172: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204583.16327: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/linux.py # zipimport: zlib available <<< 51385 1727204583.16377: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204583.16420: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py <<< 51385 1727204583.16427: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204583.16467: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204583.16555: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py # 
zipimport: zlib available <<< 51385 1727204583.16596: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204583.16667: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/default_collectors.py # zipimport: zlib available <<< 51385 1727204583.17533: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/ansible_collector.py import ansible.module_utils.facts.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/compat.py import ansible.module_utils.facts # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/__init__.py # zipimport: zlib available <<< 51385 1727204583.17726: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' <<< 51385 1727204583.17773: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py # code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' <<< 51385 1727204583.17802: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension module 'unicodedata' 
executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3707f2d190> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3707f2d730> <<< 51385 1727204583.17962: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3707ebf580> <<< 51385 1727204583.18205: stdout chunk (state=3): >>>import 'gc' # <<< 51385 1727204583.24628: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/queues.py <<< 51385 1727204583.24673: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3707f2ddc0> <<< 51385 1727204583.24704: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/synchronize.py <<< 51385 1727204583.24707: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3707d2c4c0> <<< 51385 1727204583.24758: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc' <<< 51385 1727204583.24816: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc' 
import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3707cfccd0> <<< 51385 1727204583.24822: stdout chunk (state=3): >>>import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3707cfc190> <<< 51385 1727204583.25050: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 51385 1727204583.45540: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_ssh_host_key_dsa_public": 
"AAAAB3NzaC1kc3MAAACBAL33r0sK53nK1ELEWEygFfBly+jKL3G1irB+e4OjfP+034giVSb4+qmZbsccYzFRUysDiwQ9AOkXxjXzuDmR+xqyvjg1wiGR1mtnhVEjD5QOMP3FrsA4T0YUj+99RePF5V1syZcivhL83fhGMQW2xqX2DsatToaaogZ2OB3PfjtjAAAAFQDxVD/D0jmbOX2y1lkpNUepQHopQwAAAIEAmIlCnXBlcPjUWk7TsM1FtBhlH1jHHCOTF1EkipNNB3yizhCo4XzHdZ42Etc3A12/rcZ94rFaauqCV6XrShBkQ2YBIcz9u8BOyWI/nScoq9IA/qLrhWWtjBWDyrdKnEa5YZssQtDa+FaZQkzy1TQpvFQxv5c95+TrmPFgDpw+0q0AAACBAKYOTFtEPTGzq9w6YdMspRWk65ZgXou58bQl818PvNnuZKKVReDFknfNCcabfj+HjlOg9wBCZZ+D3vopxZ4Qgevz/pLqcnLY7Kxx+xf6NhqDwcEwkHk/VYomBLrfyEZP8N81dcv36ZZUVoca5Y+2ZG2o1gC632nLGosyJBtmPmel", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCzhflzK5hY1zLI6wcdpu95QnCG0TwdK/8SyDudYYHDoRycyI9pVuSUQsXpQq3jHdjCurFgtKtyr8lvo1dWvKJ9SZpQk4asejncDNODyhSR95eNQg6E1G2kN1mscOp76cjW9Muvyhcku112WdRWTEojLJL5DfJAWrsWwHJI+QuhZuKvrlsxPvfOuY5td/aGC/Ydzbjkmya2qvXXJRscQArDnYmiPiatkFESRif9MXdmIn2LqQXAcZGFUG+SWQvZR1PDWKI2U5HxvoeUf+Uh2jDO3mFDWao9+SGRC2QuO+xLJgoiKIx2L3GWLTkbKjAbsk0iedaUuh+GdmUatsU09UVZi9IYBJYjhiYuZKsYx2LNpBqu8fxh5RaBfadQzkDGVBJE45/9X+9vlSygk3zMak9yWtS9vfV+CoODJx9wA1tv3r0Veiy/Y9bbcT7DtQhiWscP2X/cF2QZtdabW+Rb+zKZomn+6upN+zZeyVRClRsqVNURxevMs+UyJTK<<< 51385 1727204583.45615: stdout chunk (state=3): >>>V481ayMU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBHpEZiCiKJZKK5PvXzPGl0kyJcU4P7nxoUjBffLcHt9dAB0RhjGORZ4v3/W6TdO0PAsLaKZ7WyFecLN3V9VWyiA=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIK5vZWfq5/76ny3vCPOJqG/mpsIiiNwZzQWhA7bM1PFT", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_is_chroot": false, "ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", 
"ansible_fqdn": "managed-node1", "ansible_hostname": "managed-node1", "ansible_nodename": "managed-node1", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "622812727ab94fd6acd7dd0d437b6e90", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_iscsi_iqn": "", "ansible_loadavg": {"1m": 0.43, "5m": 0.43, "15m": 0.29}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "03", "second": "03", "epoch": "1727204583", "epoch_int": "1727204583", "date": "2024-09-24", "time": "15:03:03", "iso8601_micro": "2024-09-24T19:03:03.185185Z", "iso8601": "2024-09-24T19:03:03Z", "iso8601_basic": "20240924T150303185185", "iso8601_basic_short": "20240924T150303", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_local": {}, "ansible_fibre_channel_wwn": [], "ansible_fips": false, "ansible_interfaces": ["lo", "eth0", "rpltstbr"], "ansible_eth0": {"device": "eth0", "macaddress": "12:8f:92:e7:c1:ab", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.148", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::108f:92ff:fee7:c1ab", "prefix": 
"64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation":<<< 51385 1727204583.45623: stdout chunk (state=3): >>> "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": 
"off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_rpltstbr": {"device": "rpltstbr", "macaddress": "4a:d1:a2:43:cd:1d", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "ipv4": {"address": "192.0.2.72", "broadcast": "", "netmask": "255.255.255.254", "network": "192.0.2.72", "prefix": "31"}, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", 
"tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmen<<< 51385 1727204583.45632: stdout chunk (state=3): >>>tation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": 
"on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.148", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:8f:92:e7:c1:ab", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.148", "192.0.2.72"], "ansible_all_ipv6_addresses": ["fe80::108f:92ff:fee7:c1ab"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.148", 
"127.0.0.0/8", "127.0.0.1", "192.0.2.72"], "ipv6": ["::1", "fe80::108f:92ff:fee7:c1ab"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_apparmor": {"status": "disabled"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2769, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 763, "free": 2769}, "nocache": {"free": 3246, "used": 286}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec28c5e6-50d6-5684-e735-f75357a23b08", "ansible_product_uuid": "ec28c5e6-50d6-5684-e735-f75357a23b08", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, <<< 51385 1727204583.45654: stdout chunk (state=3): >>>"model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"], "labels": [], "masters": []}, "start": "2048", 
"sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 846, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264266788864, "block_size": 4096, "block_total": 65519355, "block_available": 64518259, "block_used": 1001096, "inode_total": 131071472, "inode_available": 130998222, "inode_used": 73250, "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae"}], "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 42862 10.31.9.148 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 42862 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", 
"serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_lsb": {}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5aef1ea-3141-48ae-bf33-0c6b351dd422", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 51385 1727204583.46229: stdout chunk (state=3): >>># clear builtins._ <<< 51385 1727204583.46312: stdout chunk (state=3): >>># clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache <<< 51385 1727204583.46408: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal <<< 51385 1727204583.46448: stdout chunk (state=3): >>># cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 <<< 51385 1727204583.46580: stdout chunk (state=3): >>># cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # 
cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath <<< 51385 1727204583.46668: stdout chunk (state=3): >>># cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq<<< 51385 1727204583.46732: stdout chunk (state=3): >>> # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib <<< 51385 1727204583.46750: stdout chunk (state=3): >>># cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings <<< 51385 1727204583.46755: stdout chunk (state=3): >>># cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util <<< 51385 1727204583.46772: stdout chunk (state=3): >>># cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] 
removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression <<< 51385 1727204583.46776: stdout chunk (state=3): >>># cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 <<< 51385 1727204583.46778: stdout chunk (state=3): >>># cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect <<< 51385 1727204583.46780: stdout chunk (state=3): >>># cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2<<< 51385 1727204583.46783: stdout chunk (state=3): >>> # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner<<< 51385 1727204583.46785: stdout chunk (state=3): >>> # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select <<< 51385 1727204583.46788: stdout chunk (state=3): >>># cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback <<< 51385 1727204583.46790: stdout chunk (state=3): >>># cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing 
datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string <<< 51385 1727204583.46792: stdout chunk (state=3): >>># cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat<<< 51385 1727204583.46793: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes<<< 51385 1727204583.46795: stdout chunk (state=3): >>> # cleanup[2] removing ctypes._endian <<< 51385 1727204583.46796: stdout chunk (state=3): >>># cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing <<< 
51385 1727204583.46797: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast <<< 51385 1727204583.46801: stdout chunk (state=3): >>># destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 <<< 51385 1727204583.46803: stdout chunk (state=3): >>># cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext <<< 51385 1727204583.46805: stdout chunk (state=3): >>># cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 51385 1727204583.46811: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # 
cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ <<< 51385 1727204583.46812: stdout chunk (state=3): >>># destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime <<< 51385 1727204583.46814: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser<<< 51385 1727204583.46815: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.system.local # 
cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version <<< 51385 1727204583.46816: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd <<< 51385 1727204583.46817: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network<<< 51385 1727204583.46818: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] 
removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual<<< 51385 1727204583.46820: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos <<< 51385 1727204583.46821: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts <<< 51385 1727204583.46822: stdout chunk (state=3): >>># destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy 
ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb <<< 51385 1727204583.46823: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils <<< 51385 1727204583.46824: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd <<< 51385 1727204583.46826: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network <<< 51385 
1727204583.46828: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd<<< 51385 1727204583.46829: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly<<< 51385 1727204583.46830: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep <<< 51385 1727204583.46831: stdout chunk (state=3): >>># cleanup[2] removing encodings.idna # cleanup[2] removing gc # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 51385 1727204583.47128: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 51385 1727204583.47157: stdout chunk (state=3): 
>>># destroy importlib.util # destroy importlib.abc <<< 51385 1727204583.47165: stdout chunk (state=3): >>># destroy importlib.machinery <<< 51385 1727204583.47200: stdout chunk (state=3): >>># destroy zipimport <<< 51385 1727204583.47222: stdout chunk (state=3): >>># destroy _compression # destroy binascii # destroy importlib<<< 51385 1727204583.47255: stdout chunk (state=3): >>> # destroy bz2 # destroy lzma <<< 51385 1727204583.47277: stdout chunk (state=3): >>># destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder <<< 51385 1727204583.47290: stdout chunk (state=3): >>># destroy json.scanner # destroy _json # destroy encodings <<< 51385 1727204583.47321: stdout chunk (state=3): >>># destroy syslog # destroy uuid <<< 51385 1727204583.47408: stdout chunk (state=3): >>># destroy selinux # destroy distro # destroy logging # destroy argparse <<< 51385 1727204583.47486: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors <<< 51385 1727204583.47496: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy pickle # destroy _compat_pickle # destroy queue <<< 51385 1727204583.47499: stdout chunk (state=3): >>># destroy multiprocessing.reduction <<< 51385 1727204583.47502: stdout chunk (state=3): >>># destroy shlex <<< 51385 1727204583.47527: stdout chunk (state=3): >>># destroy datetime <<< 51385 1727204583.47542: stdout chunk (state=3): >>># destroy base64 <<< 51385 1727204583.47570: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux <<< 51385 1727204583.47584: stdout chunk (state=3): >>># destroy getpass # destroy json <<< 51385 1727204583.47626: stdout chunk (state=3): >>># destroy socket # destroy struct # destroy glob # destroy 
ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout<<< 51385 1727204583.47641: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.collector # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy array # destroy multiprocessing.dummy.connection <<< 51385 1727204583.47697: stdout chunk (state=3): >>># cleanup[3] wiping gc # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping termios # cleanup[3] wiping _ssl <<< 51385 1727204583.47757: stdout chunk (state=3): >>># cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize <<< 51385 1727204583.47807: stdout chunk (state=3): >>># cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch <<< 51385 1727204583.47827: stdout chunk (state=3): >>># cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # 
cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno <<< 51385 1727204583.47851: stdout chunk (state=3): >>># cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external <<< 51385 1727204583.47875: stdout chunk (state=3): >>># cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools <<< 51385 1727204583.47892: stdout chunk (state=3): >>># cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator <<< 51385 1727204583.47991: stdout chunk (state=3): >>># cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse <<< 51385 1727204583.48010: stdout chunk (state=3): >>># cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale <<< 51385 1727204583.48013: stdout chunk (state=3): >>># cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath <<< 51385 1727204583.48016: stdout chunk (state=3): >>># cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat <<< 51385 1727204583.48018: stdout chunk (state=3): >>># destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 <<< 51385 1727204583.48020: stdout chunk (state=3): >>># cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix <<< 51385 1727204583.48022: stdout chunk (state=3): >>># cleanup[3] wiping marshal<<< 51385 
1727204583.48026: stdout chunk (state=3): >>> # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib <<< 51385 1727204583.48027: stdout chunk (state=3): >>># cleanup[3] wiping sys # cleanup[3] wiping builtins <<< 51385 1727204583.48058: stdout chunk (state=3): >>># destroy gc # destroy unicodedata <<< 51385 1727204583.48082: stdout chunk (state=3): >>># destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal <<< 51385 1727204583.48298: stdout chunk (state=3): >>># destroy platform <<< 51385 1727204583.48329: stdout chunk (state=3): >>># destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize <<< 51385 1727204583.48358: stdout chunk (state=3): >>># destroy _heapq <<< 51385 1727204583.48361: stdout chunk (state=3): >>># destroy posixpath # destroy stat <<< 51385 1727204583.48383: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib <<< 51385 1727204583.48397: stdout chunk (state=3): >>># destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors <<< 51385 1727204583.48400: stdout chunk (state=3): >>># destroy select <<< 51385 1727204583.48402: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools<<< 51385 1727204583.48404: stdout chunk (state=3): >>> # destroy itertools # destroy operator # destroy 
ansible.module_utils.six.moves # destroy _operator <<< 51385 1727204583.48430: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp <<< 51385 1727204583.48435: stdout chunk (state=3): >>># destroy io # destroy marshal <<< 51385 1727204583.48482: stdout chunk (state=3): >>># destroy _frozen_importlib # clear sys.audit hooks <<< 51385 1727204583.48946: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204583.48955: stderr chunk (state=3): >>>Shared connection to 10.31.9.148 closed. <<< 51385 1727204583.49007: stderr chunk (state=3): >>><<< 51385 1727204583.49010: stdout chunk (state=3): >>><<< 51385 1727204583.49141: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py # code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3709373dc0> # /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f37093183a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3709373b20> # /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches 
/usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3709373ac0> import '_signal' # # /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3709318490> # /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3709318940> import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3709318670> # /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py # code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py # code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f37092cf190> # /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py # code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f37092cf220> # 
/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f37092f2850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f37092cf940> import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3709330880> # /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f37092c8d90> # /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f37092f2d90> import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3709318970> Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py # code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f370926df10> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f37092740a0> # /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' import '_sre' # # /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f37092675b0> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f370926e6a0> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f370926d3d0> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py # code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches 
/usr/lib64/python3.9/collections/__init__.py # code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3708f4edf0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3708f4e8e0> import 'itertools' # # /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3708f4eee0> # /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3708f4efa0> # /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3708f4eeb0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3709249d60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3709242640> # /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f37092566a0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3709275e80> # /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3708f5fca0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3709249280> # extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f37092562b0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f370927ba30> # /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3708f5ffd0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3708f5fdc0> # /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches 
/usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3708f5fd30> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py # code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3708f323a0> # /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3708f32490> import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3708f67fd0> import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3708f61a60> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3708f61580> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from 
'/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3708e661f0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3708f1db80> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3708f61ee0> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f370927b0a0> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py # code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3708e78b20> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3708e78e50> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3708e8a760> # /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py # code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' 
import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3708e8aca0> # extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3708e183d0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3708e78f40> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3708e282b0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3708e8a5e0> import 'pwd' # # extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3708e28370> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3708f5fa00> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' # extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from 
'/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3708e446d0> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3708e449a0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3708e44790> # extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3708e44880> # /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3708e44cd0> # extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3708e50220> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3708e44910> import 
'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3708e37a60> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3708f5f5e0> # /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py # code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3708e44ac0> # code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f3708d6d6a0> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available import ansible # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f37087657f0> # /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from 
'/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' # extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3708765160> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3708765280> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3708765f40> # /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f37087654f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3708765d60> import 'atexit' # # extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3708765fa0> # /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py # code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3708765100> # /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py # code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code 
object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3708724f10> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f370863f310> # extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f370863f2e0> # /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f370863fc70> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f370874ddc0> import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f370874d3a0> # /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f370874dfa0> # /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches 
/usr/lib64/python3.9/tokenize.py # code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f370879ac70> import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f370876cd00> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f370876c3d0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3708718b50> # extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f370876c4f0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f370876c520> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py # code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f370869d310> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f37087ac220> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f37086ab880> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f37087ac3a0> # /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py # code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f37087acca0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f37086ab820> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3708745af0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f37087ac940> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f37087ac5b0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f37087a58e0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f370869f970> # extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f37086bcd60> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f37086a95e0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f370869ff10> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f37086a99d0> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.six # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/converters.py # /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import 
'_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f37086e57f0> # /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f37086ea880> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f37082519a0> import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3708723730> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.errors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/errors.py # 
zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py # code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f37087683a0> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/arg_spec.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib 
available # zipimport: zlib available # /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f37086dca00> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f37080cd4c0> import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py # code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py # code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py # code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f37086ed6a0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3708737e50> import 'distro' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f3708768ca0> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/sys_info.py import ansible.module_utils.basic # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/basic.py # zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/modules/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/namespace.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/typing.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f370827e940> # /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py # code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py # code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3708231a30> # extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f37082319a0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3708266040> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f370827eaf0> import 
'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3707fd2fa0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3707fd2be0> # /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py # code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' # extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3708748d00> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3708216e80> # /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f37087480d0> # /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f370803afd0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f3708262e50> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3707fd2e50> import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/timeout.py import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/collector.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.facter # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/facter.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/ohai.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/apparmor.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.caps # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/caps.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.chroot # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/chroot.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/utils.py import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/cmdline.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/distribution.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/datetime.py import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/date_time.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/env.py # zipimport: zlib available # zipimport: zlib available import 
ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/dns.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/fips.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/loadavg.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py # code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3707f53e50> # /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py # code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3707f539d0> import ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/local.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/lsb.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/platform.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py # code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3707f4a880> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f37082567f0> import ansible.module_utils.facts.system.python # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/python.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/selinux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/version.py import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py # 
zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py # code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3707f0b310> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3707f0b340> import ansible.module_utils.facts.system.user # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/user.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/base.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/aix.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available 
import ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/sysctl.py import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/linux.py import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py # zipimport: zlib available # 
zipimport: zlib available import ansible.module_utils.facts.hardware.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py import ansible.module_utils.facts.network.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/aix.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/darwin.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.dragonfly # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/freebsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/iscsi.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/nvme.py # zipimport: zlib available # zipimport: zlib 
available import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/default_collectors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/ansible_collector.py import ansible.module_utils.facts.compat # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/compat.py import ansible.module_utils.facts # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_cyq6lack/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/__init__.py # zipimport: zlib available # /usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py # code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3707f2d190> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3707f2d730> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3707ebf580> import 'gc' # # /usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/queues.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3707f2ddc0> # /usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3707d2c4c0> # 
/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3707cfccd0> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3707cfc190> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_ssh_host_key_dsa_public": 
"AAAAB3NzaC1kc3MAAACBAL33r0sK53nK1ELEWEygFfBly+jKL3G1irB+e4OjfP+034giVSb4+qmZbsccYzFRUysDiwQ9AOkXxjXzuDmR+xqyvjg1wiGR1mtnhVEjD5QOMP3FrsA4T0YUj+99RePF5V1syZcivhL83fhGMQW2xqX2DsatToaaogZ2OB3PfjtjAAAAFQDxVD/D0jmbOX2y1lkpNUepQHopQwAAAIEAmIlCnXBlcPjUWk7TsM1FtBhlH1jHHCOTF1EkipNNB3yizhCo4XzHdZ42Etc3A12/rcZ94rFaauqCV6XrShBkQ2YBIcz9u8BOyWI/nScoq9IA/qLrhWWtjBWDyrdKnEa5YZssQtDa+FaZQkzy1TQpvFQxv5c95+TrmPFgDpw+0q0AAACBAKYOTFtEPTGzq9w6YdMspRWk65ZgXou58bQl818PvNnuZKKVReDFknfNCcabfj+HjlOg9wBCZZ+D3vopxZ4Qgevz/pLqcnLY7Kxx+xf6NhqDwcEwkHk/VYomBLrfyEZP8N81dcv36ZZUVoca5Y+2ZG2o1gC632nLGosyJBtmPmel", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCzhflzK5hY1zLI6wcdpu95QnCG0TwdK/8SyDudYYHDoRycyI9pVuSUQsXpQq3jHdjCurFgtKtyr8lvo1dWvKJ9SZpQk4asejncDNODyhSR95eNQg6E1G2kN1mscOp76cjW9Muvyhcku112WdRWTEojLJL5DfJAWrsWwHJI+QuhZuKvrlsxPvfOuY5td/aGC/Ydzbjkmya2qvXXJRscQArDnYmiPiatkFESRif9MXdmIn2LqQXAcZGFUG+SWQvZR1PDWKI2U5HxvoeUf+Uh2jDO3mFDWao9+SGRC2QuO+xLJgoiKIx2L3GWLTkbKjAbsk0iedaUuh+GdmUatsU09UVZi9IYBJYjhiYuZKsYx2LNpBqu8fxh5RaBfadQzkDGVBJE45/9X+9vlSygk3zMak9yWtS9vfV+CoODJx9wA1tv3r0Veiy/Y9bbcT7DtQhiWscP2X/cF2QZtdabW+Rb+zKZomn+6upN+zZeyVRClRsqVNURxevMs+UyJTKV481ayMU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBHpEZiCiKJZKK5PvXzPGl0kyJcU4P7nxoUjBffLcHt9dAB0RhjGORZ4v3/W6TdO0PAsLaKZ7WyFecLN3V9VWyiA=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIK5vZWfq5/76ny3vCPOJqG/mpsIiiNwZzQWhA7bM1PFT", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_is_chroot": false, "ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node1", "ansible_hostname": 
"managed-node1", "ansible_nodename": "managed-node1", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "622812727ab94fd6acd7dd0d437b6e90", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_iscsi_iqn": "", "ansible_loadavg": {"1m": 0.43, "5m": 0.43, "15m": 0.29}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "03", "second": "03", "epoch": "1727204583", "epoch_int": "1727204583", "date": "2024-09-24", "time": "15:03:03", "iso8601_micro": "2024-09-24T19:03:03.185185Z", "iso8601": "2024-09-24T19:03:03Z", "iso8601_basic": "20240924T150303185185", "iso8601_basic_short": "20240924T150303", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_local": {}, "ansible_fibre_channel_wwn": [], "ansible_fips": false, "ansible_interfaces": ["lo", "eth0", "rpltstbr"], "ansible_eth0": {"device": "eth0", "macaddress": "12:8f:92:e7:c1:ab", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.148", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::108f:92ff:fee7:c1ab", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": 
"on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", 
"rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_rpltstbr": {"device": "rpltstbr", "macaddress": "4a:d1:a2:43:cd:1d", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "ipv4": {"address": "192.0.2.72", "broadcast": "", "netmask": "255.255.255.254", "network": "192.0.2.72", "prefix": "31"}, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off 
[fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", 
"tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.148", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:8f:92:e7:c1:ab", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.148", "192.0.2.72"], "ansible_all_ipv6_addresses": ["fe80::108f:92ff:fee7:c1ab"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.148", "127.0.0.0/8", "127.0.0.1", "192.0.2.72"], "ipv6": ["::1", "fe80::108f:92ff:fee7:c1ab"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": 
"enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_apparmor": {"status": "disabled"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2769, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 763, "free": 2769}, "nocache": {"free": 3246, "used": 286}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec28c5e6-50d6-5684-e735-f75357a23b08", "ansible_product_uuid": "ec28c5e6-50d6-5684-e735-f75357a23b08", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": 
"512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 846, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264266788864, "block_size": 4096, "block_total": 65519355, "block_available": 64518259, "block_used": 1001096, "inode_total": 131071472, "inode_available": 130998222, "inode_used": 73250, "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae"}], "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 42862 10.31.9.148 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 42862 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_lsb": {}, "ansible_hostnqn": 
"nqn.2014-08.org.nvmexpress:uuid:d5aef1ea-3141-48ae-bf33-0c6b351dd422", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing 
sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing 
ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing 
_ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy 
ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] 
removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # 
cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy 
ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy 
ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing gc # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy selinux # destroy distro # destroy logging # destroy argparse # destroy ansible.module_utils.facts.default_collectors # destroy 
ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy pickle # destroy _compat_pickle # destroy queue # destroy multiprocessing.reduction # destroy shlex # destroy datetime # destroy base64 # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json # destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping gc # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] 
wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy gc # destroy unicodedata # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # 
destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. [WARNING]: Module invocation had junk after the JSON data: # clear builtins._ # clear sys.path ... # clear sys.audit hooks [WARNING]: Platform linux on host managed-node1 is using the discovered Python interpreter at /usr/bin/python3.9, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible-core/2.17/reference_appendices/interpreter_discovery.html for more information.
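The interpreter-discovery warning above can be silenced by pinning the interpreter explicitly instead of letting Ansible discover it at runtime. A minimal inventory sketch — the host name and the /usr/bin/python3.9 path are taken from this log; treat them as placeholders for your own fleet:

```yaml
# inventory.yml -- pin the interpreter so its meaning cannot change
# if another Python is installed on the managed node later
all:
  hosts:
    managed-node1:
      ansible_python_interpreter: /usr/bin/python3.9
  # vars:
  #   # or set it group-wide instead of per host:
  #   ansible_python_interpreter: /usr/bin/python3.9
```

Setting `interpreter_python` in `ansible.cfg` is the config-file equivalent of the same pin.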
51385 1727204583.50515: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204581.6855466-51486-280200456964861/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 51385 1727204583.50519: _low_level_execute_command(): starting 51385 1727204583.50595: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204581.6855466-51486-280200456964861/ > /dev/null 2>&1 && sleep 0' 51385 1727204583.50998: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204583.51002: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204583.51035: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 51385 1727204583.51038: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204583.51040: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204583.51125: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204583.51128: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204583.51130: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204583.51186: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204583.53702: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204583.53780: stderr chunk (state=3): >>><<< 51385 1727204583.53784: stdout chunk (state=3): >>><<< 51385 1727204583.54073: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 
2 debug2: Received exit status from master 0 51385 1727204583.54076: handler run complete 51385 1727204583.54079: variable 'ansible_facts' from source: unknown 51385 1727204583.54081: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204583.54430: variable 'ansible_facts' from source: unknown 51385 1727204583.54634: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204583.54726: attempt loop complete, returning result 51385 1727204583.54733: _execute() done 51385 1727204583.54735: dumping result to json 51385 1727204583.54759: done dumping result, returning 51385 1727204583.54769: done running TaskExecutor() for managed-node1/TASK: Gathering Facts [0affcd87-79f5-6b1f-5706-0000000000af] 51385 1727204583.54775: sending task result for task 0affcd87-79f5-6b1f-5706-0000000000af 51385 1727204583.55063: done sending task result for task 0affcd87-79f5-6b1f-5706-0000000000af 51385 1727204583.55068: WORKER PROCESS EXITING ok: [managed-node1] 51385 1727204583.55316: no more pending results, returning what we have 51385 1727204583.55318: results queue empty 51385 1727204583.55319: checking for any_errors_fatal 51385 1727204583.55320: done checking for any_errors_fatal 51385 1727204583.55320: checking for max_fail_percentage 51385 1727204583.55322: done checking for max_fail_percentage 51385 1727204583.55322: checking to see if all hosts have failed and the running result is not ok 51385 1727204583.55323: done checking to see if all hosts have failed 51385 1727204583.55323: getting the remaining hosts for this loop 51385 1727204583.55324: done getting the remaining hosts for this loop 51385 1727204583.55327: getting the next task for host managed-node1 51385 1727204583.55332: done getting next task for host managed-node1 51385 1727204583.55333: ^ task is: TASK: meta (flush_handlers) 51385 1727204583.55334: ^ state is: HOST STATE: block=1, 
task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 51385 1727204583.55337: getting variables 51385 1727204583.55338: in VariableManager get_vars() 51385 1727204583.55357: Calling all_inventory to load vars for managed-node1 51385 1727204583.55359: Calling groups_inventory to load vars for managed-node1 51385 1727204583.55362: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204583.55371: Calling all_plugins_play to load vars for managed-node1 51385 1727204583.55373: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204583.55375: Calling groups_plugins_play to load vars for managed-node1 51385 1727204583.55500: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204583.55625: done with get_vars() 51385 1727204583.55633: done getting variables 51385 1727204583.55688: in VariableManager get_vars() 51385 1727204583.55695: Calling all_inventory to load vars for managed-node1 51385 1727204583.55696: Calling groups_inventory to load vars for managed-node1 51385 1727204583.55698: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204583.55701: Calling all_plugins_play to load vars for managed-node1 51385 1727204583.55702: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204583.55704: Calling groups_plugins_play to load vars for managed-node1 51385 1727204583.55795: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204583.55914: done with get_vars() 51385 1727204583.55924: done queuing things up, now waiting for results queue to drain 51385 1727204583.55926: results queue empty 51385 1727204583.55927: checking for any_errors_fatal 51385 
1727204583.55928: done checking for any_errors_fatal 51385 1727204583.55929: checking for max_fail_percentage 51385 1727204583.55929: done checking for max_fail_percentage 51385 1727204583.55933: checking to see if all hosts have failed and the running result is not ok 51385 1727204583.55933: done checking to see if all hosts have failed 51385 1727204583.55934: getting the remaining hosts for this loop 51385 1727204583.55935: done getting the remaining hosts for this loop 51385 1727204583.55936: getting the next task for host managed-node1 51385 1727204583.55940: done getting next task for host managed-node1 51385 1727204583.55941: ^ task is: TASK: Include the task 'el_repo_setup.yml' 51385 1727204583.55942: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51385 1727204583.55944: getting variables 51385 1727204583.55944: in VariableManager get_vars() 51385 1727204583.55949: Calling all_inventory to load vars for managed-node1 51385 1727204583.55951: Calling groups_inventory to load vars for managed-node1 51385 1727204583.55952: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204583.55955: Calling all_plugins_play to load vars for managed-node1 51385 1727204583.55956: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204583.55958: Calling groups_plugins_play to load vars for managed-node1 51385 1727204583.56060: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204583.56177: done with get_vars() 51385 1727204583.56183: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tests_vlan_mtu_nm.yml:11 Tuesday 24 September 2024 15:03:03 -0400 (0:00:01.933) 0:00:01.966 ***** 51385 1727204583.56239: entering _queue_task() for managed-node1/include_tasks 51385 1727204583.56240: Creating lock for include_tasks 51385 1727204583.56663: worker is 1 (out of 1 available) 51385 1727204583.56676: exiting _queue_task() for managed-node1/include_tasks 51385 1727204583.56688: done queuing things up, now waiting for results queue to drain 51385 1727204583.56690: waiting for pending results... 
51385 1727204583.56936: running TaskExecutor() for managed-node1/TASK: Include the task 'el_repo_setup.yml' 51385 1727204583.57032: in run() - task 0affcd87-79f5-6b1f-5706-000000000006 51385 1727204583.57051: variable 'ansible_search_path' from source: unknown 51385 1727204583.57092: calling self._execute() 51385 1727204583.57161: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204583.57175: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204583.57189: variable 'omit' from source: magic vars 51385 1727204583.57291: _execute() done 51385 1727204583.57299: dumping result to json 51385 1727204583.57305: done dumping result, returning 51385 1727204583.57315: done running TaskExecutor() for managed-node1/TASK: Include the task 'el_repo_setup.yml' [0affcd87-79f5-6b1f-5706-000000000006] 51385 1727204583.57326: sending task result for task 0affcd87-79f5-6b1f-5706-000000000006 51385 1727204583.57432: done sending task result for task 0affcd87-79f5-6b1f-5706-000000000006 51385 1727204583.57439: WORKER PROCESS EXITING 51385 1727204583.57498: no more pending results, returning what we have 51385 1727204583.57504: in VariableManager get_vars() 51385 1727204583.57538: Calling all_inventory to load vars for managed-node1 51385 1727204583.57541: Calling groups_inventory to load vars for managed-node1 51385 1727204583.57545: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204583.57558: Calling all_plugins_play to load vars for managed-node1 51385 1727204583.57562: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204583.57567: Calling groups_plugins_play to load vars for managed-node1 51385 1727204583.57749: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204583.57946: done with get_vars() 51385 1727204583.57954: variable 'ansible_search_path' from source: unknown 51385 1727204583.57973: we have 
included files to process 51385 1727204583.57975: generating all_blocks data 51385 1727204583.57976: done generating all_blocks data 51385 1727204583.57977: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 51385 1727204583.57979: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 51385 1727204583.57982: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 51385 1727204583.58905: in VariableManager get_vars() 51385 1727204583.58922: done with get_vars() 51385 1727204583.58933: done processing included file 51385 1727204583.58935: iterating over new_blocks loaded from include file 51385 1727204583.58937: in VariableManager get_vars() 51385 1727204583.58946: done with get_vars() 51385 1727204583.58948: filtering new block on tags 51385 1727204583.58965: done filtering new block on tags 51385 1727204583.58968: in VariableManager get_vars() 51385 1727204583.58979: done with get_vars() 51385 1727204583.58980: filtering new block on tags 51385 1727204583.58996: done filtering new block on tags 51385 1727204583.58998: in VariableManager get_vars() 51385 1727204583.59008: done with get_vars() 51385 1727204583.59010: filtering new block on tags 51385 1727204583.59023: done filtering new block on tags 51385 1727204583.59025: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed-node1 51385 1727204583.59031: extending task lists for all hosts with included blocks 51385 1727204583.59083: done extending task lists 51385 1727204583.59084: done processing included files 51385 1727204583.59085: results queue empty 51385 1727204583.59086: checking for any_errors_fatal 51385 1727204583.59087: done checking for any_errors_fatal 51385 
1727204583.59088: checking for max_fail_percentage 51385 1727204583.59089: done checking for max_fail_percentage 51385 1727204583.59090: checking to see if all hosts have failed and the running result is not ok 51385 1727204583.59090: done checking to see if all hosts have failed 51385 1727204583.59091: getting the remaining hosts for this loop 51385 1727204583.59092: done getting the remaining hosts for this loop 51385 1727204583.59095: getting the next task for host managed-node1 51385 1727204583.59099: done getting next task for host managed-node1 51385 1727204583.59101: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 51385 1727204583.59103: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51385 1727204583.59106: getting variables 51385 1727204583.59107: in VariableManager get_vars() 51385 1727204583.59115: Calling all_inventory to load vars for managed-node1 51385 1727204583.59118: Calling groups_inventory to load vars for managed-node1 51385 1727204583.59120: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204583.59126: Calling all_plugins_play to load vars for managed-node1 51385 1727204583.59128: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204583.59131: Calling groups_plugins_play to load vars for managed-node1 51385 1727204583.59297: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204583.59490: done with get_vars() 51385 1727204583.59498: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Tuesday 24 September 2024 15:03:03 -0400 (0:00:00.033) 0:00:01.999 ***** 51385 1727204583.59565: entering _queue_task() for managed-node1/setup 51385 1727204583.59842: worker is 1 (out of 1 available) 51385 1727204583.59854: exiting _queue_task() for managed-node1/setup 51385 1727204583.59865: done queuing things up, now waiting for results queue to drain 51385 1727204583.59866: waiting for pending results... 
51385 1727204583.60105: running TaskExecutor() for managed-node1/TASK: Gather the minimum subset of ansible_facts required by the network role test 51385 1727204583.60458: in run() - task 0affcd87-79f5-6b1f-5706-0000000000c0 51385 1727204583.60479: variable 'ansible_search_path' from source: unknown 51385 1727204583.60487: variable 'ansible_search_path' from source: unknown 51385 1727204583.60531: calling self._execute() 51385 1727204583.60604: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204583.60615: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204583.60630: variable 'omit' from source: magic vars 51385 1727204583.61189: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 51385 1727204583.63392: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 51385 1727204583.63447: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 51385 1727204583.63494: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 51385 1727204583.63695: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 51385 1727204583.63728: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 51385 1727204583.63810: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 51385 1727204583.63842: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 51385 1727204583.63874: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 51385 1727204583.63920: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 51385 1727204583.63940: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 51385 1727204583.64141: variable 'ansible_facts' from source: unknown 51385 1727204583.64237: variable 'network_test_required_facts' from source: task vars 51385 1727204583.64435: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 51385 1727204583.64447: variable 'omit' from source: magic vars 51385 1727204583.64486: variable 'omit' from source: magic vars 51385 1727204583.64527: variable 'omit' from source: magic vars 51385 1727204583.64554: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 51385 1727204583.64587: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 51385 1727204583.64611: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 51385 1727204583.64630: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 51385 1727204583.64643: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 51385 1727204583.64677: variable 'inventory_hostname' from source: host vars for 'managed-node1' 51385 1727204583.64685: variable 'ansible_host' from source: host vars for 
'managed-node1' 51385 1727204583.64691: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204583.64781: Set connection var ansible_pipelining to False 51385 1727204583.64789: Set connection var ansible_shell_type to sh 51385 1727204583.64804: Set connection var ansible_module_compression to ZIP_DEFLATED 51385 1727204583.64818: Set connection var ansible_timeout to 10 51385 1727204583.64825: Set connection var ansible_connection to ssh 51385 1727204583.64835: Set connection var ansible_shell_executable to /bin/sh 51385 1727204583.64863: variable 'ansible_shell_executable' from source: unknown 51385 1727204583.64872: variable 'ansible_connection' from source: unknown 51385 1727204583.64877: variable 'ansible_module_compression' from source: unknown 51385 1727204583.64882: variable 'ansible_shell_type' from source: unknown 51385 1727204583.64887: variable 'ansible_shell_executable' from source: unknown 51385 1727204583.64892: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204583.64897: variable 'ansible_pipelining' from source: unknown 51385 1727204583.64901: variable 'ansible_timeout' from source: unknown 51385 1727204583.64906: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204583.65035: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 51385 1727204583.65053: variable 'omit' from source: magic vars 51385 1727204583.65062: starting attempt loop 51385 1727204583.65072: running the handler 51385 1727204583.65090: _low_level_execute_command(): starting 51385 1727204583.65101: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 51385 1727204583.66626: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204583.66631: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204583.66670: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204583.66674: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204583.66677: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204583.66759: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204583.66762: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204583.66770: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204583.66831: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204583.68928: stdout chunk (state=3): >>>/root <<< 51385 1727204583.69159: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204583.69163: stdout chunk (state=3): >>><<< 51385 1727204583.69168: stderr chunk (state=3): >>><<< 51385 1727204583.69301: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204583.69314: _low_level_execute_command(): starting 51385 1727204583.69317: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204583.6919074-51577-84965345317622 `" && echo ansible-tmp-1727204583.6919074-51577-84965345317622="` echo /root/.ansible/tmp/ansible-tmp-1727204583.6919074-51577-84965345317622 `" ) && sleep 0' 51385 1727204583.70092: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204583.70107: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204583.70121: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204583.70140: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204583.70190: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 
1727204583.70201: stderr chunk (state=3): >>>debug2: match not found <<< 51385 1727204583.70214: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204583.70232: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 51385 1727204583.70243: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 51385 1727204583.70254: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 51385 1727204583.70268: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204583.70288: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204583.70303: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204583.70315: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204583.70326: stderr chunk (state=3): >>>debug2: match found <<< 51385 1727204583.70338: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204583.70429: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204583.70462: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204583.70490: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204583.70583: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204583.73171: stdout chunk (state=3): >>>ansible-tmp-1727204583.6919074-51577-84965345317622=/root/.ansible/tmp/ansible-tmp-1727204583.6919074-51577-84965345317622 <<< 51385 1727204583.73323: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204583.73409: stderr chunk (state=3): >>><<< 51385 1727204583.73422: stdout chunk (state=3): >>><<< 
51385 1727204583.73575: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204583.6919074-51577-84965345317622=/root/.ansible/tmp/ansible-tmp-1727204583.6919074-51577-84965345317622 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204583.73578: variable 'ansible_module_compression' from source: unknown 51385 1727204583.73581: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-51385tpxlmlox/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 51385 1727204583.73684: variable 'ansible_facts' from source: unknown 51385 1727204583.73813: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204583.6919074-51577-84965345317622/AnsiballZ_setup.py 51385 1727204583.73990: Sending initial data 51385 1727204583.73993: Sent initial data (153 bytes) 51385 1727204583.75051: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 
1727204583.75075: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204583.75094: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204583.75121: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204583.75170: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204583.75183: stderr chunk (state=3): >>>debug2: match not found <<< 51385 1727204583.75198: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204583.75226: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 51385 1727204583.75240: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 51385 1727204583.75252: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 51385 1727204583.75270: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204583.75285: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204583.75300: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204583.75320: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204583.75332: stderr chunk (state=3): >>>debug2: match found <<< 51385 1727204583.75345: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204583.75433: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204583.75456: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204583.75480: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204583.75584: 
stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204583.77280: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 51385 1727204583.77335: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 51385 1727204583.77396: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-51385tpxlmlox/tmpqkpzr5y_ /root/.ansible/tmp/ansible-tmp-1727204583.6919074-51577-84965345317622/AnsiballZ_setup.py <<< 51385 1727204583.77448: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 51385 1727204583.80039: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204583.80376: stderr chunk (state=3): >>><<< 51385 1727204583.80381: stdout chunk (state=3): >>><<< 51385 1727204583.80384: done transferring module to remote 51385 1727204583.80387: _low_level_execute_command(): starting 51385 1727204583.80390: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204583.6919074-51577-84965345317622/ /root/.ansible/tmp/ansible-tmp-1727204583.6919074-51577-84965345317622/AnsiballZ_setup.py && sleep 0' 51385 1727204583.82397: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 51385 1727204583.82401: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204583.82435: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 51385 1727204583.82438: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 51385 1727204583.82441: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204583.82444: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204583.82508: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204583.82548: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204583.82551: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204583.82710: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204583.84349: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204583.84433: stderr chunk (state=3): >>><<< 51385 1727204583.84437: stdout chunk (state=3): >>><<< 51385 1727204583.84533: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204583.84537: _low_level_execute_command(): starting 51385 1727204583.84539: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204583.6919074-51577-84965345317622/AnsiballZ_setup.py && sleep 0' 51385 1727204583.85765: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204583.85877: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204583.85894: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204583.85929: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204583.86123: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204583.86153: stderr chunk (state=3): >>>debug2: match not found <<< 51385 1727204583.86209: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204583.86247: stderr chunk (state=3): 
>>>debug1: configuration requests final Match pass <<< 51385 1727204583.86306: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 51385 1727204583.86359: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 51385 1727204583.86375: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204583.86413: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204583.86429: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204583.86442: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204583.86453: stderr chunk (state=3): >>>debug2: match found <<< 51385 1727204583.86470: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204583.86566: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204583.86603: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204583.86682: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204583.86880: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204583.88814: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin <<< 51385 1727204583.88818: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 51385 1727204583.88893: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 51385 1727204583.88912: stdout chunk (state=3): >>>import 'posix' # <<< 51385 1727204583.88941: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 51385 1727204583.88993: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook <<< 51385 
1727204583.89042: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' <<< 51385 1727204583.89086: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py <<< 51385 1727204583.89112: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # <<< 51385 1727204583.89132: stdout chunk (state=3): >>>import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e2698dc0> <<< 51385 1727204583.89159: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py <<< 51385 1727204583.89210: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e263d3a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e2698b20> <<< 51385 1727204583.89239: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e2698ac0> <<< 51385 1727204583.89255: stdout chunk (state=3): >>>import '_signal' # <<< 51385 1727204583.89291: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e263d490> <<< 
51385 1727204583.89355: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' <<< 51385 1727204583.89359: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e263d940> <<< 51385 1727204583.89375: stdout chunk (state=3): >>>import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e263d670> <<< 51385 1727204583.89408: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py <<< 51385 1727204583.89427: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' <<< 51385 1727204583.89462: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py # code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' <<< 51385 1727204583.89482: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py # code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' <<< 51385 1727204583.89516: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e23cf190> <<< 51385 1727204583.89547: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py <<< 51385 1727204583.89569: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' <<< 51385 1727204583.89629: stdout chunk (state=3): >>>import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f87e23cf220> <<< 51385 1727204583.89706: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' <<< 51385 1727204583.89710: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e23f2850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e23cf940> <<< 51385 1727204583.89729: stdout chunk (state=3): >>>import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e2655880> <<< 51385 1727204583.89784: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e23c8d90> <<< 51385 1727204583.89836: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e23f2d90> <<< 51385 1727204583.89913: stdout chunk (state=3): >>>import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e263d970> <<< 51385 1727204583.89917: stdout chunk (state=3): >>>Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 51385 1727204583.90259: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py <<< 51385 1727204583.90277: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' <<< 51385 1727204583.90298: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' <<< 51385 1727204583.90324: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py # code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' <<< 51385 1727204583.90368: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' <<< 51385 1727204583.90371: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e2393f10> <<< 51385 1727204583.90412: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e23990a0> <<< 51385 1727204583.90515: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' import '_sre' # # /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' <<< 51385 1727204583.90594: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f87e238c5b0> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e23946a0> <<< 51385 1727204583.90619: stdout chunk (state=3): >>>import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e23933d0> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py <<< 51385 1727204583.90818: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py # code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' <<< 51385 1727204583.90840: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f87e227ae50> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e227a940> import 'itertools' # # /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e227af40> <<< 51385 1727204583.90927: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e227ad90> <<< 51385 1727204583.90972: 
stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e228b100> import '_collections' # <<< 51385 1727204583.91027: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e236edc0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e23676a0> <<< 51385 1727204583.91202: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e237a700> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e239ae50> <<< 51385 1727204583.91207: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f87e228bd00> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e236e2e0> # extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f87e237a310> import 'base64' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f87e23a0a00> <<< 51385 1727204583.91392: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' <<< 51385 1727204583.91424: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e228bee0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e228be20> # /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e228bd90> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py <<< 51385 1727204583.91519: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from 
'/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e225e400> <<< 51385 1727204583.91572: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e225e4f0> <<< 51385 1727204583.91684: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e2293f70> <<< 51385 1727204583.91739: stdout chunk (state=3): >>>import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e228dac0> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e228d490> <<< 51385 1727204583.91798: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' <<< 51385 1727204583.91844: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e2192250> <<< 51385 1727204583.91920: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e2249550> <<< 51385 1727204583.91946: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e228df40> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f87e23a00a0> <<< 51385 1727204583.92079: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py # code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' <<< 51385 1727204583.92124: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e21a4b80> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f87e21a4eb0> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e21b67c0> <<< 51385 1727204583.92148: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py <<< 51385 1727204583.92204: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e21b6d00> <<< 51385 1727204583.92243: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' 
import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f87e214e430> <<< 51385 1727204583.92319: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e21a4fa0> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' <<< 51385 1727204583.92340: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f87e215f310> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e21b6640> <<< 51385 1727204583.92357: stdout chunk (state=3): >>>import 'pwd' # <<< 51385 1727204583.92383: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f87e215f3d0> <<< 51385 1727204583.92422: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e228ba60> <<< 51385 1727204583.92441: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' <<< 51385 1727204583.92480: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py <<< 51385 1727204583.92513: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' <<< 51385 1727204583.92541: stdout chunk (state=3): >>># 
extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f87e217b730> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' <<< 51385 1727204583.92575: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' <<< 51385 1727204583.92602: stdout chunk (state=3): >>># extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f87e217ba00> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e217b7f0> <<< 51385 1727204583.92631: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f87e217b8e0> <<< 51385 1727204583.92645: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' <<< 51385 1727204583.92826: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f87e217bd30> 
<<< 51385 1727204583.92881: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f87e2185280> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e217b970> <<< 51385 1727204583.92900: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e216eac0> <<< 51385 1727204583.92912: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e228b640> <<< 51385 1727204583.92942: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py <<< 51385 1727204583.92989: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' <<< 51385 1727204583.93025: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e217bb20> <<< 51385 1727204583.93179: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/cp437.pyc' <<< 51385 1727204583.93197: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f87e20a9700> <<< 51385 1727204583.93469: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip' # zipimport: zlib available <<< 51385 1727204583.93560: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204583.93612: stdout chunk (state=3): >>>import ansible # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/__init__.py <<< 51385 1727204583.93641: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib 
available import ansible.module_utils # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/__init__.py <<< 51385 1727204583.93654: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204583.95238: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204583.96755: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' <<< 51385 1727204583.96791: stdout chunk (state=3): >>>import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e1fe6850> # /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' <<< 51385 1727204583.96807: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py <<< 51385 1727204583.96831: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' <<< 51385 1727204583.96868: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f87e1fe6160> <<< 51385 1727204583.97090: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e1fe6280> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f87e1fe6fa0> # /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' <<< 51385 1727204583.97192: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e1fe64f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e1fe6dc0> import 'atexit' # # extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f87e1fe6580> # /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py <<< 51385 1727204583.97290: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e1fe6100> # /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' <<< 51385 1727204583.97383: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py # code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' <<< 51385 1727204583.97490: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e1f7b0a0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' 
executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f87e1989370> <<< 51385 1727204583.97566: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f87e1989070> # /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' <<< 51385 1727204583.97604: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e1989cd0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e1fcedc0> <<< 51385 1727204583.97736: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e1fce3a0> <<< 51385 1727204583.97746: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' <<< 51385 1727204583.97767: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e1fcef40> <<< 51385 1727204583.97853: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' <<< 51385 1727204583.97865: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py <<< 51385 1727204583.97907: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' <<< 51385 1727204583.97917: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e201df40> <<< 51385 1727204583.97989: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e1fa4d60> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e1fa4430> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e1fe3af0> <<< 51385 1727204583.98026: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f87e1fa4550> <<< 51385 1727204583.98055: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e1fa4580> <<< 51385 1727204583.98086: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py <<< 51385 1727204583.98104: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' # 
/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py <<< 51385 1727204583.98143: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' <<< 51385 1727204583.98212: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f87e19f4fa0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e202f280> <<< 51385 1727204583.98256: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py <<< 51385 1727204583.98259: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' <<< 51385 1727204583.98307: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f87e19f1820> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e202f400> <<< 51385 1727204583.98331: stdout chunk (state=3): >>># /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py <<< 51385 1727204583.98405: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' <<< 51385 1727204583.98408: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import 
'_string' # <<< 51385 1727204583.98467: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e202fc40> <<< 51385 1727204583.98594: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e19f17c0> <<< 51385 1727204583.98688: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f87e1fc71c0> <<< 51385 1727204583.98734: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f87e202f9d0> <<< 51385 1727204583.98805: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f87e202f550> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e2028940> <<< 51385 1727204583.98818: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' <<< 51385 1727204583.98840: stdout chunk 
(state=3): >>># /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' <<< 51385 1727204583.98889: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f87e19e7910> <<< 51385 1727204583.99482: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f87e1f3edc0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e19f0550> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f87e19e7eb0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e19f0970> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # 
zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available <<< 51385 1727204583.99649: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.00354: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.00481: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.01269: stdout chunk (state=3): >>>import ansible.module_utils.six # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/six/__init__.py<<< 51385 1727204584.01314: stdout chunk (state=3): >>> <<< 51385 1727204584.01318: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # <<< 51385 1727204584.01321: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # <<< 51385 1727204584.01323: stdout chunk (state=3): >>>import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/common/text/converters.py<<< 51385 1727204584.01325: stdout chunk (state=3): >>> <<< 51385 1727204584.01366: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py<<< 51385 1727204584.01388: stdout chunk (state=3): >>> <<< 51385 1727204584.01391: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc'<<< 51385 1727204584.01396: stdout chunk (state=3): >>> <<< 51385 1727204584.01503: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so'<<< 51385 1727204584.01507: stdout chunk (state=3): >>> # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so'<<< 51385 
1727204584.01509: stdout chunk (state=3): >>> import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f87e1f3b7f0><<< 51385 1727204584.01518: stdout chunk (state=3): >>> <<< 51385 1727204584.01613: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e1f788b0> <<< 51385 1727204584.01645: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e1591940> <<< 51385 1727204584.01681: stdout chunk (state=3): >>>import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/compat/selinux.py <<< 51385 1727204584.01707: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 51385 1727204584.01728: stdout chunk (state=3): >>>import ansible.module_utils._text # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available <<< 51385 1727204584.01849: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.01977: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py <<< 51385 1727204584.01996: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' <<< 51385 1727204584.02008: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e1fa3730> # zipimport: zlib available <<< 51385 1727204584.02390: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.02969: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.03325: stdout chunk (state=3): >>># 
zipimport: zlib available import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available # zipimport: zlib available<<< 51385 1727204584.03332: stdout chunk (state=3): >>> <<< 51385 1727204584.03438: stdout chunk (state=3): >>>import ansible.module_utils.errors # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/errors.py<<< 51385 1727204584.03456: stdout chunk (state=3): >>> <<< 51385 1727204584.03483: stdout chunk (state=3): >>># zipimport: zlib available<<< 51385 1727204584.03490: stdout chunk (state=3): >>> <<< 51385 1727204584.03508: stdout chunk (state=3): >>># zipimport: zlib available<<< 51385 1727204584.03518: stdout chunk (state=3): >>> <<< 51385 1727204584.03535: stdout chunk (state=3): >>>import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/parsing/__init__.py<<< 51385 1727204584.03540: stdout chunk (state=3): >>> <<< 51385 1727204584.03570: stdout chunk (state=3): >>># zipimport: zlib available<<< 51385 1727204584.03577: stdout chunk (state=3): >>> <<< 51385 1727204584.03630: stdout chunk (state=3): >>># zipimport: zlib available<<< 51385 1727204584.03635: stdout chunk (state=3): >>> <<< 51385 1727204584.03697: stdout chunk (state=3): >>>import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/parsing/convert_bool.py<<< 51385 1727204584.03705: stdout chunk (state=3): >>> <<< 51385 1727204584.03720: stdout chunk (state=3): >>># zipimport: zlib available<<< 51385 1727204584.03729: 
stdout chunk (state=3): >>> <<< 51385 1727204584.04035: stdout chunk (state=3): >>># zipimport: zlib available<<< 51385 1727204584.04048: stdout chunk (state=3): >>> <<< 51385 1727204584.04370: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py<<< 51385 1727204584.04378: stdout chunk (state=3): >>> <<< 51385 1727204584.04423: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc'<<< 51385 1727204584.04435: stdout chunk (state=3): >>> <<< 51385 1727204584.04456: stdout chunk (state=3): >>>import '_ast' # <<< 51385 1727204584.04461: stdout chunk (state=3): >>> <<< 51385 1727204584.04569: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e1fe92e0><<< 51385 1727204584.04577: stdout chunk (state=3): >>> <<< 51385 1727204584.04592: stdout chunk (state=3): >>># zipimport: zlib available<<< 51385 1727204584.04599: stdout chunk (state=3): >>> <<< 51385 1727204584.04697: stdout chunk (state=3): >>># zipimport: zlib available<<< 51385 1727204584.04704: stdout chunk (state=3): >>> <<< 51385 1727204584.04798: stdout chunk (state=3): >>>import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/common/text/formatters.py<<< 51385 1727204584.04806: stdout chunk (state=3): >>> <<< 51385 1727204584.04823: stdout chunk (state=3): >>>import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/common/validation.py<<< 51385 1727204584.04828: stdout chunk (state=3): >>> <<< 51385 1727204584.04849: stdout chunk (state=3): >>>import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/common/parameters.py<<< 51385 1727204584.04856: stdout chunk (state=3): >>> <<< 
51385 1727204584.04878: stdout chunk (state=3): >>>import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/common/arg_spec.py<<< 51385 1727204584.04886: stdout chunk (state=3): >>> <<< 51385 1727204584.04916: stdout chunk (state=3): >>># zipimport: zlib available<<< 51385 1727204584.04921: stdout chunk (state=3): >>> <<< 51385 1727204584.04981: stdout chunk (state=3): >>># zipimport: zlib available<<< 51385 1727204584.04990: stdout chunk (state=3): >>> <<< 51385 1727204584.05041: stdout chunk (state=3): >>>import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/common/locale.py<<< 51385 1727204584.05049: stdout chunk (state=3): >>> <<< 51385 1727204584.05077: stdout chunk (state=3): >>># zipimport: zlib available<<< 51385 1727204584.05085: stdout chunk (state=3): >>> <<< 51385 1727204584.05136: stdout chunk (state=3): >>># zipimport: zlib available<<< 51385 1727204584.05145: stdout chunk (state=3): >>> <<< 51385 1727204584.05208: stdout chunk (state=3): >>># zipimport: zlib available<<< 51385 1727204584.05218: stdout chunk (state=3): >>> <<< 51385 1727204584.05347: stdout chunk (state=3): >>># zipimport: zlib available<<< 51385 1727204584.05356: stdout chunk (state=3): >>> <<< 51385 1727204584.05453: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py<<< 51385 1727204584.05458: stdout chunk (state=3): >>> <<< 51385 1727204584.05505: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc'<<< 51385 1727204584.05513: stdout chunk (state=3): >>> <<< 51385 1727204584.05608: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from 
'/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so'<<< 51385 1727204584.05626: stdout chunk (state=3): >>> <<< 51385 1727204584.05645: stdout chunk (state=3): >>># extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' <<< 51385 1727204584.05659: stdout chunk (state=3): >>>import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f87e1f5c790> <<< 51385 1727204584.05780: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e1422ac0><<< 51385 1727204584.05785: stdout chunk (state=3): >>> <<< 51385 1727204584.05829: stdout chunk (state=3): >>>import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/common/file.py<<< 51385 1727204584.05841: stdout chunk (state=3): >>> <<< 51385 1727204584.05858: stdout chunk (state=3): >>>import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/common/process.py<<< 51385 1727204584.05880: stdout chunk (state=3): >>> <<< 51385 1727204584.05885: stdout chunk (state=3): >>># zipimport: zlib available<<< 51385 1727204584.05887: stdout chunk (state=3): >>> <<< 51385 1727204584.05978: stdout chunk (state=3): >>># zipimport: zlib available<<< 51385 1727204584.05987: stdout chunk (state=3): >>> <<< 51385 1727204584.06076: stdout chunk (state=3): >>># zipimport: zlib available<<< 51385 1727204584.06081: stdout chunk (state=3): >>> <<< 51385 1727204584.06117: stdout chunk (state=3): >>># zipimport: zlib available<<< 51385 1727204584.06122: stdout chunk (state=3): >>> <<< 51385 1727204584.06186: stdout chunk (state=3): >>># /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py<<< 51385 1727204584.06196: stdout chunk 
(state=3): >>> <<< 51385 1727204584.06218: stdout chunk (state=3): >>># code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc'<<< 51385 1727204584.06226: stdout chunk (state=3): >>> <<< 51385 1727204584.06261: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py<<< 51385 1727204584.06273: stdout chunk (state=3): >>> <<< 51385 1727204584.06321: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc'<<< 51385 1727204584.06326: stdout chunk (state=3): >>> <<< 51385 1727204584.06362: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py<<< 51385 1727204584.06371: stdout chunk (state=3): >>> <<< 51385 1727204584.06409: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc'<<< 51385 1727204584.06417: stdout chunk (state=3): >>> <<< 51385 1727204584.06542: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e1f6f910><<< 51385 1727204584.06549: stdout chunk (state=3): >>> <<< 51385 1727204584.06613: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e1fba970><<< 51385 1727204584.06619: stdout chunk (state=3): >>> <<< 51385 1727204584.06704: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e1fa2850><<< 51385 1727204584.06718: stdout chunk (state=3): >>> # destroy ansible.module_utils.distro<<< 51385 1727204584.06740: stdout chunk (state=3): >>> <<< 51385 1727204584.06743: stdout chunk (state=3): >>>import ansible.module_utils.distro # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/distro/__init__.py<<< 51385 1727204584.06750: stdout chunk (state=3): >>> <<< 51385 1727204584.06767: stdout 
chunk (state=3): >>># zipimport: zlib available<<< 51385 1727204584.06773: stdout chunk (state=3): >>> <<< 51385 1727204584.06808: stdout chunk (state=3): >>># zipimport: zlib available<<< 51385 1727204584.06814: stdout chunk (state=3): >>> <<< 51385 1727204584.06859: stdout chunk (state=3): >>>import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/common/_utils.py<<< 51385 1727204584.06874: stdout chunk (state=3): >>> import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/common/sys_info.py<<< 51385 1727204584.06879: stdout chunk (state=3): >>> <<< 51385 1727204584.06982: stdout chunk (state=3): >>>import ansible.module_utils.basic # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/basic.py<<< 51385 1727204584.06988: stdout chunk (state=3): >>> <<< 51385 1727204584.07009: stdout chunk (state=3): >>># zipimport: zlib available<<< 51385 1727204584.07017: stdout chunk (state=3): >>> <<< 51385 1727204584.07038: stdout chunk (state=3): >>># zipimport: zlib available<<< 51385 1727204584.07048: stdout chunk (state=3): >>> <<< 51385 1727204584.07073: stdout chunk (state=3): >>>import ansible.modules # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/modules/__init__.py<<< 51385 1727204584.07076: stdout chunk (state=3): >>> <<< 51385 1727204584.07098: stdout chunk (state=3): >>># zipimport: zlib available<<< 51385 1727204584.07103: stdout chunk (state=3): >>> <<< 51385 1727204584.07190: stdout chunk (state=3): >>># zipimport: zlib available<<< 51385 1727204584.07195: stdout chunk (state=3): >>> <<< 51385 1727204584.07284: stdout chunk (state=3): >>># zipimport: zlib available<<< 51385 1727204584.07291: stdout chunk (state=3): >>> <<< 51385 1727204584.07316: stdout chunk (state=3): >>># zipimport: zlib 
available<<< 51385 1727204584.07321: stdout chunk (state=3): >>> <<< 51385 1727204584.07363: stdout chunk (state=3): >>># zipimport: zlib available<<< 51385 1727204584.07367: stdout chunk (state=3): >>> <<< 51385 1727204584.07419: stdout chunk (state=3): >>># zipimport: zlib available<<< 51385 1727204584.07425: stdout chunk (state=3): >>> <<< 51385 1727204584.07489: stdout chunk (state=3): >>># zipimport: zlib available<<< 51385 1727204584.07494: stdout chunk (state=3): >>> <<< 51385 1727204584.07543: stdout chunk (state=3): >>># zipimport: zlib available<<< 51385 1727204584.07549: stdout chunk (state=3): >>> <<< 51385 1727204584.07597: stdout chunk (state=3): >>>import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/namespace.py<<< 51385 1727204584.07603: stdout chunk (state=3): >>> <<< 51385 1727204584.07628: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.07736: stdout chunk (state=3): >>># zipimport: zlib available<<< 51385 1727204584.07740: stdout chunk (state=3): >>> <<< 51385 1727204584.07847: stdout chunk (state=3): >>># zipimport: zlib available<<< 51385 1727204584.07853: stdout chunk (state=3): >>> <<< 51385 1727204584.07884: stdout chunk (state=3): >>># zipimport: zlib available<<< 51385 1727204584.07893: stdout chunk (state=3): >>> <<< 51385 1727204584.07937: stdout chunk (state=3): >>>import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/compat/typing.py<<< 51385 1727204584.07948: stdout chunk (state=3): >>> <<< 51385 1727204584.07967: stdout chunk (state=3): >>># zipimport: zlib available<<< 51385 1727204584.07972: stdout chunk (state=3): >>> <<< 51385 1727204584.08199: stdout chunk (state=3): >>># zipimport: zlib available<<< 51385 1727204584.08205: stdout chunk (state=3): >>> <<< 51385 1727204584.08433: stdout chunk (state=3): >>># 
zipimport: zlib available<<< 51385 1727204584.08439: stdout chunk (state=3): >>> <<< 51385 1727204584.08496: stdout chunk (state=3): >>># zipimport: zlib available<<< 51385 1727204584.08501: stdout chunk (state=3): >>> <<< 51385 1727204584.08576: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py<<< 51385 1727204584.08594: stdout chunk (state=3): >>> <<< 51385 1727204584.08599: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' <<< 51385 1727204584.08630: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py<<< 51385 1727204584.08635: stdout chunk (state=3): >>> <<< 51385 1727204584.08659: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc'<<< 51385 1727204584.08666: stdout chunk (state=3): >>> <<< 51385 1727204584.08696: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py<<< 51385 1727204584.08707: stdout chunk (state=3): >>> <<< 51385 1727204584.08723: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc'<<< 51385 1727204584.08728: stdout chunk (state=3): >>> <<< 51385 1727204584.08769: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e130dc70><<< 51385 1727204584.08774: stdout chunk (state=3): >>> <<< 51385 1727204584.08811: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py<<< 51385 1727204584.08817: stdout chunk (state=3): >>> <<< 51385 1727204584.08833: stdout chunk 
(state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc'<<< 51385 1727204584.08838: stdout chunk (state=3): >>> <<< 51385 1727204584.08873: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py<<< 51385 1727204584.08878: stdout chunk (state=3): >>> <<< 51385 1727204584.08926: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc'<<< 51385 1727204584.08930: stdout chunk (state=3): >>> <<< 51385 1727204584.08961: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py<<< 51385 1727204584.08976: stdout chunk (state=3): >>> <<< 51385 1727204584.08993: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc'<<< 51385 1727204584.09001: stdout chunk (state=3): >>> <<< 51385 1727204584.09018: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e15735e0><<< 51385 1727204584.09027: stdout chunk (state=3): >>> <<< 51385 1727204584.09082: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so'<<< 51385 1727204584.09100: stdout chunk (state=3): >>> # extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so'<<< 51385 1727204584.09116: stdout chunk (state=3): >>> <<< 51385 1727204584.09120: stdout chunk (state=3): >>>import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f87e1573670><<< 51385 1727204584.09123: stdout chunk (state=3): >>> <<< 51385 1727204584.09213: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e158f4f0><<< 51385 1727204584.09220: stdout chunk (state=3): >>> <<< 51385 
1727204584.09253: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e158fac0><<< 51385 1727204584.09257: stdout chunk (state=3): >>> <<< 51385 1727204584.09296: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e15a32e0><<< 51385 1727204584.09302: stdout chunk (state=3): >>> <<< 51385 1727204584.09325: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e15a3970><<< 51385 1727204584.09331: stdout chunk (state=3): >>> <<< 51385 1727204584.09366: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py<<< 51385 1727204584.09369: stdout chunk (state=3): >>> <<< 51385 1727204584.09397: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc'<<< 51385 1727204584.09402: stdout chunk (state=3): >>> <<< 51385 1727204584.09431: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py<<< 51385 1727204584.09436: stdout chunk (state=3): >>> <<< 51385 1727204584.09463: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc'<<< 51385 1727204584.09469: stdout chunk (state=3): >>> <<< 51385 1727204584.09503: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so'<<< 51385 1727204584.09526: stdout chunk (state=3): >>> <<< 51385 1727204584.09531: stdout chunk (state=3): >>># extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so'<<< 51385 1727204584.09533: stdout chunk (state=3): >>> import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f87e15542b0><<< 
51385 1727204584.09535: stdout chunk (state=3): >>> <<< 51385 1727204584.09568: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e1554a00><<< 51385 1727204584.09571: stdout chunk (state=3): >>> <<< 51385 1727204584.09599: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py<<< 51385 1727204584.09604: stdout chunk (state=3): >>> <<< 51385 1727204584.09628: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc'<<< 51385 1727204584.09634: stdout chunk (state=3): >>> <<< 51385 1727204584.09679: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e1554940><<< 51385 1727204584.09683: stdout chunk (state=3): >>> <<< 51385 1727204584.09711: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py<<< 51385 1727204584.09716: stdout chunk (state=3): >>> <<< 51385 1727204584.09753: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc'<<< 51385 1727204584.09758: stdout chunk (state=3): >>> <<< 51385 1727204584.09811: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' <<< 51385 1727204584.09828: stdout chunk (state=3): >>># extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so'<<< 51385 1727204584.09832: stdout chunk (state=3): >>> <<< 51385 1727204584.09835: stdout chunk (state=3): >>>import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f87e136f0d0><<< 51385 1727204584.09837: stdout chunk (state=3): >>> <<< 
51385 1727204584.09888: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e1f5b3a0><<< 51385 1727204584.09899: stdout chunk (state=3): >>> <<< 51385 1727204584.09925: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e15a3670><<< 51385 1727204584.09934: stdout chunk (state=3): >>> <<< 51385 1727204584.09949: stdout chunk (state=3): >>>import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/timeout.py<<< 51385 1727204584.09953: stdout chunk (state=3): >>> <<< 51385 1727204584.09980: stdout chunk (state=3): >>>import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/collector.py<<< 51385 1727204584.09986: stdout chunk (state=3): >>> <<< 51385 1727204584.10012: stdout chunk (state=3): >>># zipimport: zlib available<<< 51385 1727204584.10024: stdout chunk (state=3): >>> <<< 51385 1727204584.10039: stdout chunk (state=3): >>># zipimport: zlib available<<< 51385 1727204584.10056: stdout chunk (state=3): >>> <<< 51385 1727204584.10062: stdout chunk (state=3): >>>import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/other/__init__.py<<< 51385 1727204584.10067: stdout chunk (state=3): >>> <<< 51385 1727204584.10068: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.10121: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.10358: stdout chunk (state=3): >>>import ansible.module_utils.facts.other.facter # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/other/facter.py # zipimport: zlib available # zipimport: zlib available import 
ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/other/ohai.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/system/__init__.py # zipimport: zlib available <<< 51385 1727204584.10365: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.10388: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/system/apparmor.py <<< 51385 1727204584.10400: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.10457: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.10511: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.caps # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/system/caps.py <<< 51385 1727204584.10516: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.10573: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.10621: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.chroot # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/system/chroot.py <<< 51385 1727204584.10626: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.10703: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.10775: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.10839: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.10913: stdout chunk (state=3): >>>import ansible.module_utils.facts.utils # loaded from Zip 
/tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/utils.py <<< 51385 1727204584.10917: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/system/cmdline.py <<< 51385 1727204584.10922: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.11567: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.12160: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/system/distribution.py # zipimport: zlib available <<< 51385 1727204584.12230: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.12298: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.12334: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.12380: stdout chunk (state=3): >>>import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/compat/datetime.py <<< 51385 1727204584.12384: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/system/date_time.py # zipimport: zlib available <<< 51385 1727204584.12421: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.12451: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/system/env.py <<< 51385 1727204584.12467: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.12528: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.12599: stdout chunk 
(state=3): >>>import ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/system/dns.py <<< 51385 1727204584.12602: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.12639: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.12669: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/system/fips.py <<< 51385 1727204584.12685: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.12712: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.12753: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/system/loadavg.py <<< 51385 1727204584.12756: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.12859: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.12952: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py <<< 51385 1727204584.12957: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' <<< 51385 1727204584.12985: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e1260eb0> <<< 51385 1727204584.13009: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py <<< 51385 1727204584.13045: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' <<< 51385 1727204584.13294: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e12609d0> 
<<< 51385 1727204584.13297: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/system/local.py # zipimport: zlib available <<< 51385 1727204584.13385: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.13469: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/system/lsb.py <<< 51385 1727204584.13475: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.13585: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.13698: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py <<< 51385 1727204584.13703: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.13789: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.13888: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/system/platform.py <<< 51385 1727204584.13895: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.13935: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.13998: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py <<< 51385 1727204584.14021: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' <<< 51385 1727204584.14224: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' # extension module '_ssl' executed from 
'/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f87e12cfbb0> <<< 51385 1727204584.14625: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e1286a60> import ansible.module_utils.facts.system.python # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/system/python.py <<< 51385 1727204584.14629: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.14696: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.14766: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/system/selinux.py # zipimport: zlib available <<< 51385 1727204584.14875: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.14977: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.15113: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.15313: stdout chunk (state=3): >>>import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/compat/version.py <<< 51385 1727204584.15317: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py # zipimport: zlib available <<< 51385 1727204584.15373: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.15420: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py <<< 51385 1727204584.15427: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 51385 1727204584.15474: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.15531: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py # code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' <<< 51385 1727204584.15590: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f87e12d3040> <<< 51385 1727204584.15608: stdout chunk (state=3): >>>import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e12d36d0> import ansible.module_utils.facts.system.user # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/system/user.py # zipimport: zlib available <<< 51385 1727204584.15618: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py <<< 51385 1727204584.15639: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.15684: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.15733: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/base.py <<< 51385 1727204584.15737: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.15935: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.16135: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.aix # loaded 
from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/aix.py <<< 51385 1727204584.16138: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.16265: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.16382: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.16425: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.16484: stdout chunk (state=3): >>>import ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/sysctl.py import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py <<< 51385 1727204584.16493: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.16587: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.16613: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.16787: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.16971: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py <<< 51385 1727204584.16980: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py # zipimport: zlib available <<< 51385 1727204584.17130: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.17287: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py <<< 51385 
1727204584.17293: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.17334: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.17376: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.18080: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.18763: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/linux.py import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py # zipimport: zlib available <<< 51385 1727204584.18879: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.19025: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py # zipimport: zlib available <<< 51385 1727204584.19151: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.19333: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.openbsd # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py # zipimport: zlib available <<< 51385 1727204584.19508: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.19705: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py <<< 51385 1727204584.19711: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.19736: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.network # loaded from Zip 
/tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/network/__init__.py <<< 51385 1727204584.19750: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.19762: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.19806: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/network/base.py <<< 51385 1727204584.19816: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.19936: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.20066: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.20337: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.20604: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py <<< 51385 1727204584.20607: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.aix # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/network/aix.py # zipimport: zlib available <<< 51385 1727204584.20651: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.20693: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/network/darwin.py <<< 51385 1727204584.20711: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.20722: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.20758: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.dragonfly # loaded from Zip 
/tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py # zipimport: zlib available <<< 51385 1727204584.20851: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.20943: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py <<< 51385 1727204584.20954: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.20982: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.21000: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/network/freebsd.py # zipimport: zlib available <<< 51385 1727204584.21077: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.21154: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/network/hpux.py <<< 51385 1727204584.21157: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.21217: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.21302: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/network/hurd.py # zipimport: zlib available <<< 51385 1727204584.21644: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.21984: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/network/linux.py <<< 51385 1727204584.21990: stdout chunk (state=3): >>># zipimport: zlib available <<< 
51385 1727204584.22059: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.22124: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/network/iscsi.py <<< 51385 1727204584.22130: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.22180: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.22210: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/network/nvme.py <<< 51385 1727204584.22224: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.22257: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.22298: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/network/netbsd.py <<< 51385 1727204584.22303: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.22344: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.22379: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/network/openbsd.py <<< 51385 1727204584.22393: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.22490: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.22590: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/network/sunos.py <<< 51385 1727204584.22606: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.22613: stdout 
chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py <<< 51385 1727204584.22634: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.22687: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.22740: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/base.py <<< 51385 1727204584.22743: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.22772: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.22791: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.22854: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.22915: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.23004: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.23093: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py <<< 51385 1727204584.23108: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py <<< 51385 1727204584.23116: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.23180: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.23243: stdout chunk (state=3): >>>import 
ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py <<< 51385 1727204584.23251: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.23511: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.23777: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/linux.py <<< 51385 1727204584.23784: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.23825: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.23884: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py # zipimport: zlib available <<< 51385 1727204584.23947: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.24017: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py # zipimport: zlib available <<< 51385 1727204584.24375: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/default_collectors.py # zipimport: zlib available # zipimport: zlib available <<< 51385 1727204584.24435: stdout chunk (state=3): >>>import ansible.module_utils.facts.ansible_collector # loaded from Zip 
/tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/ansible_collector.py import ansible.module_utils.facts.compat # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/compat.py import ansible.module_utils.facts # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/__init__.py <<< 51385 1727204584.24529: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.24794: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' <<< 51385 1727204584.24827: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py # code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' <<< 51385 1727204584.24880: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f87e1255a30> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e11ff040> <<< 51385 1727204584.24967: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e11ffc40> <<< 51385 1727204584.26409: stdout chunk (state=3): >>>import 'gc' # <<< 51385 1727204584.26963: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", 
"LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 42862 10.31.9.148 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 42862 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_fips": false, "ansible_local": {}, "ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node1", "ansible_hostname": "managed-node1", "ansible_nodename": "managed-node1", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "622812727ab94fd6acd7dd0d437b6e90", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": <<< 51385 1727204584.26998: stdout chunk (state=3): >>>"RedHat", "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, 
"version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "03", "second": "04", "epoch": "1727204584", "epoch_int": "1727204584", "date": "2024-09-24", "time": "15:03:04", "iso8601_micro": "2024-09-24T19:03:04.261613Z", "iso8601": "2024-09-24T19:03:04Z", "iso8601_basic": "20240924T150304261613", "iso8601_basic_short": "20240924T150304", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_lsb": {}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAL33r0sK53nK1ELEWEygFfBly+jKL3G1irB+e4OjfP+034giVSb4+qmZbsccYzFRUysDiwQ9AOkXxjXzuDmR+xqyvjg1wiGR1mtnhVEjD5QOMP3FrsA4T0YUj+99RePF5V1syZcivhL83fhGMQW2xqX2DsatToaaogZ2OB3PfjtjAAAAFQDxVD/D0jmbOX2y1lkpNUepQHopQwAAAIEAmIlCnXBlcPjUWk7TsM1FtBhlH1jHHCOTF1EkipNNB3yizhCo4XzHdZ42Etc3A12/rcZ94rFaauqCV6XrShBkQ2YBIcz9u8BOyWI/nScoq9IA/qLrhWWtjBWDyrdKnEa5YZssQtDa+FaZQkzy1TQpvFQxv5c95+TrmPFgDpw+0q0AAACBAKYOTFtEPTGzq9w6YdMspRWk65ZgXou58bQl818PvNnuZKKVReDFknfNCcabfj+HjlOg9wBCZZ+D3vopxZ4Qgevz/pLqcnLY7Kxx+xf6NhqDwcEwkHk/VYomBLrfyEZP8N81dcv36ZZUVoca5Y+2ZG2o1gC632nLGosyJBtmPmel", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCzhflzK5hY1zLI6wcdpu95QnCG0TwdK/8SyDudYYHDoRycyI9pVuSUQsXpQq3jHdjCurFgtKtyr8lvo1dWvKJ9SZpQk4asejncDNODyhSR95eNQg6E1G2kN1mscOp76cjW9Muvyhcku112WdRWTEojLJL5DfJAWrsWwHJI+QuhZuKvrlsxPvfOuY5td/aGC/Ydzbjkmya2qvXXJRscQArDnYmiPiatkFESRif9MXdmIn2LqQXAcZGFUG+SWQvZR1PDWKI2U5HxvoeUf+Uh2jDO3mFDWao9+SGRC2QuO+xLJgoiKIx2L3GWLTkbKjAbsk0iedaUuh+GdmUatsU09UVZi9IYBJYjhiYuZKsYx2LNpBqu8fxh5RaBfadQzkDGVBJE45/9X+9vlSygk3zMak9yWtS9vfV+CoODJx9wA1tv3r0Veiy/Y9bbcT7DtQhiWscP2X/cF2QZtdabW+Rb+zKZomn+6upN+zZeyVRClRsqVNURxevMs+UyJTKV481ayMU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBHpEZiCiKJZKK5PvXzPGl0kyJcU4P7nxoUjBffLcHt9dAB0RhjGORZ4v3/W6TdO0PAsLaKZ7WyFecLN3V9VWyiA=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIK5vZWfq5/76ny3vCPOJqG/mpsIiiNwZzQWhA7bM1PFT", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_apparmor": {"status": "disabled"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_pkg_mgr": "dnf", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 51385 1727204584.27677: stdout chunk (state=3): >>># clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear 
sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib <<< 51385 1727204584.27748: stdout chunk (state=3): >>># cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing 
copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random <<< 51385 1727204584.27906: stdout chunk (state=3): >>># destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing 
_posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # 
destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast <<< 51385 1727204584.28021: stdout chunk (state=3): >>># destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # 
cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing 
ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys<<< 51385 1727204584.28095: stdout chunk (state=3): >>> # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] 
removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # 
destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local <<< 51385 1727204584.28119: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # 
destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing gc <<< 51385 1727204584.28474: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 51385 1727204584.28506: stdout chunk (state=3): >>># destroy importlib.util # destroy importlib.abc # destroy importlib.machinery <<< 51385 1727204584.28533: stdout chunk (state=3): >>># destroy zipimport <<< 51385 1727204584.28550: stdout chunk (state=3): >>># destroy _compression <<< 51385 1727204584.28618: stdout chunk (state=3): >>># destroy binascii # destroy importlib # destroy bz2 # destroy lzma <<< 51385 1727204584.28626: stdout chunk (state=3): >>># destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings <<< 51385 1727204584.28662: stdout chunk (state=3): >>># destroy syslog # destroy uuid <<< 51385 1727204584.28711: stdout chunk (state=3): >>># destroy selinux <<< 51385 1727204584.28714: stdout chunk (state=3): >>># destroy distro # destroy logging # destroy argparse <<< 51385 1727204584.28815: stdout chunk (state=3): >>># destroy 
ansible.module_utils.facts.default_collectors <<< 51385 1727204584.28819: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool <<< 51385 1727204584.28824: stdout chunk (state=3): >>># destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle <<< 51385 1727204584.28828: stdout chunk (state=3): >>># destroy queue <<< 51385 1727204584.28831: stdout chunk (state=3): >>># destroy multiprocessing.process <<< 51385 1727204584.28854: stdout chunk (state=3): >>># destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction <<< 51385 1727204584.28857: stdout chunk (state=3): >>># destroy shlex # destroy datetime <<< 51385 1727204584.28893: stdout chunk (state=3): >>># destroy base64 <<< 51385 1727204584.28896: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux # destroy getpass <<< 51385 1727204584.28902: stdout chunk (state=3): >>># destroy json <<< 51385 1727204584.28932: stdout chunk (state=3): >>># destroy socket # destroy struct <<< 51385 1727204584.28935: stdout chunk (state=3): >>># destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector <<< 51385 1727204584.29071: stdout chunk (state=3): >>># cleanup[3] wiping gc # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket <<< 
51385 1727204584.29173: stdout chunk (state=3): >>># cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external <<< 51385 1727204584.29234: stdout chunk (state=3): >>># cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re <<< 51385 1727204584.29288: stdout chunk (state=3): >>># destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy 
_stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys <<< 51385 1727204584.29330: stdout chunk (state=3): >>># cleanup[3] wiping builtins <<< 51385 1727204584.29346: stdout chunk (state=3): >>># destroy gc # destroy unicodedata # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal <<< 51385 1727204584.29594: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy _sre # destroy sre_parse <<< 51385 1727204584.29648: stdout chunk (state=3): >>># destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator <<< 51385 1727204584.29682: stdout chunk (state=3): >>># destroy 
_frozen_importlib_external<<< 51385 1727204584.29704: stdout chunk (state=3): >>> # destroy _imp # destroy io # destroy marshal <<< 51385 1727204584.29720: stdout chunk (state=3): >>># destroy _frozen_importlib # clear sys.audit hooks <<< 51385 1727204584.30160: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. <<< 51385 1727204584.30241: stderr chunk (state=3): >>><<< 51385 1727204584.30246: stdout chunk (state=3): >>><<< 51385 1727204584.30457: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py # code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e2698dc0> # /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e263d3a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e2698b20> # /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f87e2698ac0> import '_signal' # # /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e263d490> # /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e263d940> import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e263d670> # /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py # code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py # code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e23cf190> # /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py # code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e23cf220> # /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' # 
/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e23f2850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e23cf940> import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e2655880> # /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e23c8d90> # /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e23f2d90> import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e263d970> Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py # code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e2393f10> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e23990a0> # /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' import '_sre' # # /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e238c5b0> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e23946a0> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e23933d0> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py # code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches 
/usr/lib64/python3.9/collections/__init__.py # code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f87e227ae50> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e227a940> import 'itertools' # # /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e227af40> # /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e227ad90> # /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e228b100> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e236edc0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e23676a0> # /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f87e237a700> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e239ae50> # /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f87e228bd00> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e236e2e0> # extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f87e237a310> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e23a0a00> # /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e228bee0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e228be20> # /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches 
/usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e228bd90> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py # code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e225e400> # /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e225e4f0> import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e2293f70> import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e228dac0> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e228d490> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from 
'/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e2192250> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e2249550> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e228df40> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e23a00a0> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py # code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e21a4b80> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f87e21a4eb0> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e21b67c0> # /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py # code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' 
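[editor's note] The recurring `<path>.pyc matches <path>.py` / `code object from ...` pairs above are CPython's verbose import trace (enabled via `python -v` or `PYTHONVERBOSE`): for each source file the import machinery computes the expected cached-bytecode path under `__pycache__` and validates it before reusing the `.pyc`. A minimal sketch of that path computation, using one path taken from the trace (the exact interpreter tag in the result follows whichever Python runs it, e.g. `cpython-39` on the managed node):

```python
import importlib.util

# Compute where CPython expects the cached bytecode for a source file.
# This is the same mapping behind the ".pyc matches .py" trace lines.
src = "/usr/lib64/python3.9/base64.py"  # path taken from the trace above
cached = importlib.util.cache_from_source(src)
print(cached)  # .../__pycache__/base64.cpython-<tag>.pyc
```

On the Python 3.9 target shown in the log this yields `/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc`, matching the trace.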
import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e21b6d00> # extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f87e214e430> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e21a4fa0> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f87e215f310> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e21b6640> import 'pwd' # # extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f87e215f3d0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e228ba60> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' # extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from 
'/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f87e217b730> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f87e217ba00> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e217b7f0> # extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f87e217b8e0> # /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f87e217bd30> # extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f87e2185280> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e217b970> import 
'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e216eac0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e228b640> # /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py # code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e217bb20> # code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f87e20a9700> # zipimport: found 103 names in '/tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available import ansible # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e1fe6850> # /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' # extension module '_json' 
loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f87e1fe6160> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e1fe6280> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e1fe6fa0> # /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e1fe64f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e1fe6dc0> import 'atexit' # # extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f87e1fe6580> # /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py # code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e1fe6100> # /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py # code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' import 'signal' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f87e1f7b0a0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f87e1989370> # extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f87e1989070> # /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e1989cd0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e1fcedc0> import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e1fce3a0> # /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e1fcef40> # /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py # code object from 
'/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e201df40> import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e1fa4d60> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e1fa4430> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e1fe3af0> # extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f87e1fa4550> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e1fa4580> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py # code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f87e19f4fa0> 
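[editor's note] The `zipimport: found 103 names in '/tmp/ansible_setup_payload_.../ansible_setup_payload.zip'` and `loaded from Zip ...` lines earlier in this trace show Ansible's AnsiballZ mechanism: the module and its `module_utils` are shipped as a single zip archive, which is put on `sys.path` so Python's zipimport hook can import packages directly from it. A self-contained sketch of that import path (the payload name and `mypkg` module here are illustrative, not taken from a real Ansible payload):

```python
import importlib
import os
import sys
import tempfile
import zipfile

# Build a tiny zip archive containing an importable package.
tmp = tempfile.mkdtemp()
payload = os.path.join(tmp, "payload.zip")
with zipfile.ZipFile(payload, "w") as zf:
    zf.writestr("mypkg/__init__.py", "VALUE = 42\n")

# Putting the archive on sys.path lets zipimport resolve imports from it,
# analogous to how the ansible_setup_payload.zip entries are loaded above.
sys.path.insert(0, payload)
mod = importlib.import_module("mypkg")
print(mod.__file__)  # path points inside payload.zip
print(mod.VALUE)     # -> 42
```

The `# zipimport: zlib available` markers in the log simply confirm that the interpreter can decompress deflate-compressed archive members while doing this.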
import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e202f280> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f87e19f1820> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e202f400> # /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py # code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e202fc40> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e19f17c0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f87e1fc71c0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f87e202f9d0> # extension module 
'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f87e202f550> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e2028940> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f87e19e7910> # extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f87e1f3edc0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e19f0550> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f87e19e7eb0> import 'systemd.daemon' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f87e19f0970> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.six # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/common/text/converters.py # /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f87e1f3b7f0> # /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from 
'/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e1f788b0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e1591940> import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e1fa3730> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.errors # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/errors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing.convert_bool # loaded from Zip 
/tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py # code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e1fe92e0> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/common/arg_spec.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from 
'/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f87e1f5c790> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e1422ac0> import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py # code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py # code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py # code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e1f6f910> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e1fba970> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e1fa2850> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded 
from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/common/sys_info.py import ansible.module_utils.basic # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/basic.py # zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/modules/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/namespace.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/compat/typing.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e130dc70> # 
/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py # code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py # code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e15735e0> # extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f87e1573670> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e158f4f0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e158fac0> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e15a32e0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e15a3970> # /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py # code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' # extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' # extension module '_queue' executed from 
'/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f87e15542b0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e1554a00> # /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e1554940> # /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f87e136f0d0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e1f5b3a0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e15a3670> import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/timeout.py import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/collector.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/other/__init__.py # zipimport: zlib available # zipimport: zlib available import 
ansible.module_utils.facts.other.facter # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/other/facter.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/other/ohai.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/system/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/system/apparmor.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.caps # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/system/caps.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.chroot # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/system/chroot.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/utils.py import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/system/cmdline.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/system/distribution.py # zipimport: zlib 
available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/compat/datetime.py import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/system/date_time.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/system/env.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/system/dns.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/system/fips.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/system/loadavg.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py # code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e1260eb0> # /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py # code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e12609d0> import ansible.module_utils.facts.system.local # loaded from Zip 
/tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/system/local.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/system/lsb.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/system/platform.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py # code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f87e12cfbb0> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e1286a60> import ansible.module_utils.facts.system.python # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/system/python.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/system/selinux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.version # loaded from Zip 
/tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/compat/version.py import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py # code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f87e12d3040> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e12d36d0> import ansible.module_utils.facts.system.user # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/system/user.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/base.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.aix # loaded from Zip 
/tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/aix.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/sysctl.py import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/linux.py import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py # zipimport: zlib available # zipimport: zlib available 
import ansible.module_utils.facts.hardware.openbsd # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/network/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/network/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py import ansible.module_utils.facts.network.aix # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/network/aix.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/network/darwin.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.dragonfly # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.fc_wwn # loaded from Zip 
/tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/network/freebsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/network/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/network/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/network/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/network/iscsi.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/network/nvme.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/network/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/network/openbsd.py # zipimport: zlib available # zipimport: zlib available import 
ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/network/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py # 
zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/default_collectors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/ansible_collector.py import ansible.module_utils.facts.compat # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/compat.py import ansible.module_utils.facts # loaded from Zip /tmp/ansible_setup_payload_l7irmkkh/ansible_setup_payload.zip/ansible/module_utils/facts/__init__.py # zipimport: zlib available # /usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py # code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f87e1255a30> import 'stringprep' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f87e11ff040> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87e11ffc40> import 'gc' # {"ansible_facts": {"ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 42862 10.31.9.148 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 42862 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_fips": false, "ansible_local": {}, "ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node1", "ansible_hostname": "managed-node1", "ansible_nodename": "managed-node1", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "622812727ab94fd6acd7dd0d437b6e90", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", 
"ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "03", "second": "04", "epoch": "1727204584", "epoch_int": "1727204584", "date": "2024-09-24", "time": "15:03:04", "iso8601_micro": "2024-09-24T19:03:04.261613Z", "iso8601": "2024-09-24T19:03:04Z", "iso8601_basic": "20240924T150304261613", "iso8601_basic_short": "20240924T150304", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_lsb": {}, "ansible_ssh_host_key_dsa_public": 
"AAAAB3NzaC1kc3MAAACBAL33r0sK53nK1ELEWEygFfBly+jKL3G1irB+e4OjfP+034giVSb4+qmZbsccYzFRUysDiwQ9AOkXxjXzuDmR+xqyvjg1wiGR1mtnhVEjD5QOMP3FrsA4T0YUj+99RePF5V1syZcivhL83fhGMQW2xqX2DsatToaaogZ2OB3PfjtjAAAAFQDxVD/D0jmbOX2y1lkpNUepQHopQwAAAIEAmIlCnXBlcPjUWk7TsM1FtBhlH1jHHCOTF1EkipNNB3yizhCo4XzHdZ42Etc3A12/rcZ94rFaauqCV6XrShBkQ2YBIcz9u8BOyWI/nScoq9IA/qLrhWWtjBWDyrdKnEa5YZssQtDa+FaZQkzy1TQpvFQxv5c95+TrmPFgDpw+0q0AAACBAKYOTFtEPTGzq9w6YdMspRWk65ZgXou58bQl818PvNnuZKKVReDFknfNCcabfj+HjlOg9wBCZZ+D3vopxZ4Qgevz/pLqcnLY7Kxx+xf6NhqDwcEwkHk/VYomBLrfyEZP8N81dcv36ZZUVoca5Y+2ZG2o1gC632nLGosyJBtmPmel", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCzhflzK5hY1zLI6wcdpu95QnCG0TwdK/8SyDudYYHDoRycyI9pVuSUQsXpQq3jHdjCurFgtKtyr8lvo1dWvKJ9SZpQk4asejncDNODyhSR95eNQg6E1G2kN1mscOp76cjW9Muvyhcku112WdRWTEojLJL5DfJAWrsWwHJI+QuhZuKvrlsxPvfOuY5td/aGC/Ydzbjkmya2qvXXJRscQArDnYmiPiatkFESRif9MXdmIn2LqQXAcZGFUG+SWQvZR1PDWKI2U5HxvoeUf+Uh2jDO3mFDWao9+SGRC2QuO+xLJgoiKIx2L3GWLTkbKjAbsk0iedaUuh+GdmUatsU09UVZi9IYBJYjhiYuZKsYx2LNpBqu8fxh5RaBfadQzkDGVBJE45/9X+9vlSygk3zMak9yWtS9vfV+CoODJx9wA1tv3r0Veiy/Y9bbcT7DtQhiWscP2X/cF2QZtdabW+Rb+zKZomn+6upN+zZeyVRClRsqVNURxevMs+UyJTKV481ayMU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBHpEZiCiKJZKK5PvXzPGl0kyJcU4P7nxoUjBffLcHt9dAB0RhjGORZ4v3/W6TdO0PAsLaKZ7WyFecLN3V9VWyiA=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIK5vZWfq5/76ny3vCPOJqG/mpsIiiNwZzQWhA7bM1PFT", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_apparmor": {"status": "disabled"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", 
"ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_pkg_mgr": "dnf", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum 
# cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] 
removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy 
ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] 
removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing 
ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing 
ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy 
ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy 
ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing gc # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy selinux # destroy distro # destroy logging # destroy argparse # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy pickle # 
destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy queue # destroy multiprocessing.process # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy shlex # destroy datetime # destroy base64 # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json # destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping gc # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # 
cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy gc # destroy unicodedata # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy 
sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
[WARNING]: Module invocation had junk after the JSON data: # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] 
ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks 51385 1727204584.31702: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204583.6919074-51577-84965345317622/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 51385 1727204584.31705: _low_level_execute_command(): starting 51385 1727204584.31708: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204583.6919074-51577-84965345317622/ > /dev/null 2>&1 && sleep 0' 51385 1727204584.32362: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204584.32380: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204584.32393: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204584.32409: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204584.32488: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 
1727204584.32567: stderr chunk (state=3): >>>debug2: match not found <<< 51385 1727204584.32583: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204584.32599: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 51385 1727204584.32610: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 51385 1727204584.32619: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 51385 1727204584.32629: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204584.32641: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204584.32658: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204584.32679: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204584.32692: stderr chunk (state=3): >>>debug2: match found <<< 51385 1727204584.32706: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204584.32907: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204584.32931: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204584.32950: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204584.33050: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204584.35590: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204584.35672: stderr chunk (state=3): >>><<< 51385 1727204584.35675: stdout chunk (state=3): >>><<< 51385 1727204584.35772: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204584.35775: handler run complete 51385 1727204584.35778: variable 'ansible_facts' from source: unknown 51385 1727204584.35881: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204584.35918: variable 'ansible_facts' from source: unknown 51385 1727204584.35974: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204584.36041: attempt loop complete, returning result 51385 1727204584.36050: _execute() done 51385 1727204584.36056: dumping result to json 51385 1727204584.36075: done dumping result, returning 51385 1727204584.36088: done running TaskExecutor() for managed-node1/TASK: Gather the minimum subset of ansible_facts required by the network role test [0affcd87-79f5-6b1f-5706-0000000000c0] 51385 1727204584.36103: sending task result for task 0affcd87-79f5-6b1f-5706-0000000000c0 ok: [managed-node1] 51385 
1727204584.36387: no more pending results, returning what we have 51385 1727204584.36390: results queue empty 51385 1727204584.36390: checking for any_errors_fatal 51385 1727204584.36392: done checking for any_errors_fatal 51385 1727204584.36392: checking for max_fail_percentage 51385 1727204584.36394: done checking for max_fail_percentage 51385 1727204584.36395: checking to see if all hosts have failed and the running result is not ok 51385 1727204584.36395: done checking to see if all hosts have failed 51385 1727204584.36396: getting the remaining hosts for this loop 51385 1727204584.36397: done getting the remaining hosts for this loop 51385 1727204584.36401: getting the next task for host managed-node1 51385 1727204584.36410: done getting next task for host managed-node1 51385 1727204584.36413: ^ task is: TASK: Check if system is ostree 51385 1727204584.36415: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51385 1727204584.36418: getting variables 51385 1727204584.36420: in VariableManager get_vars() 51385 1727204584.36446: Calling all_inventory to load vars for managed-node1 51385 1727204584.36449: Calling groups_inventory to load vars for managed-node1 51385 1727204584.36452: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204584.36467: Calling all_plugins_play to load vars for managed-node1 51385 1727204584.36469: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204584.36473: Calling groups_plugins_play to load vars for managed-node1 51385 1727204584.36660: done sending task result for task 0affcd87-79f5-6b1f-5706-0000000000c0 51385 1727204584.36666: WORKER PROCESS EXITING 51385 1727204584.36678: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204584.36856: done with get_vars() 51385 1727204584.36872: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Tuesday 24 September 2024 15:03:04 -0400 (0:00:00.774) 0:00:02.773 ***** 51385 1727204584.36989: entering _queue_task() for managed-node1/stat 51385 1727204584.37289: worker is 1 (out of 1 available) 51385 1727204584.37301: exiting _queue_task() for managed-node1/stat 51385 1727204584.37317: done queuing things up, now waiting for results queue to drain 51385 1727204584.37319: waiting for pending results... 
51385 1727204584.37604: running TaskExecutor() for managed-node1/TASK: Check if system is ostree 51385 1727204584.37706: in run() - task 0affcd87-79f5-6b1f-5706-0000000000c2 51385 1727204584.37720: variable 'ansible_search_path' from source: unknown 51385 1727204584.37723: variable 'ansible_search_path' from source: unknown 51385 1727204584.37769: calling self._execute() 51385 1727204584.37838: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204584.37842: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204584.37853: variable 'omit' from source: magic vars 51385 1727204584.38383: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 51385 1727204584.38660: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 51385 1727204584.38706: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 51385 1727204584.38748: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 51385 1727204584.38792: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 51385 1727204584.38918: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 51385 1727204584.38944: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 51385 1727204584.38987: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 51385 1727204584.39009: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 51385 1727204584.39143: Evaluated conditional (not __network_is_ostree is defined): True 51385 1727204584.39151: variable 'omit' from source: magic vars 51385 1727204584.39198: variable 'omit' from source: magic vars 51385 1727204584.39236: variable 'omit' from source: magic vars 51385 1727204584.39261: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 51385 1727204584.39303: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 51385 1727204584.39319: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 51385 1727204584.39336: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 51385 1727204584.39346: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 51385 1727204584.39380: variable 'inventory_hostname' from source: host vars for 'managed-node1' 51385 1727204584.39384: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204584.39393: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204584.39501: Set connection var ansible_pipelining to False 51385 1727204584.39508: Set connection var ansible_shell_type to sh 51385 1727204584.39522: Set connection var ansible_module_compression to ZIP_DEFLATED 51385 1727204584.39528: Set connection var ansible_timeout to 10 51385 1727204584.39530: Set connection var ansible_connection to ssh 51385 1727204584.39536: Set connection var ansible_shell_executable to /bin/sh 51385 1727204584.39560: variable 'ansible_shell_executable' from source: unknown 51385 1727204584.39567: variable 'ansible_connection' from 
source: unknown 51385 1727204584.39570: variable 'ansible_module_compression' from source: unknown 51385 1727204584.39573: variable 'ansible_shell_type' from source: unknown 51385 1727204584.39577: variable 'ansible_shell_executable' from source: unknown 51385 1727204584.39579: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204584.39584: variable 'ansible_pipelining' from source: unknown 51385 1727204584.39586: variable 'ansible_timeout' from source: unknown 51385 1727204584.39591: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204584.39758: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 51385 1727204584.39773: variable 'omit' from source: magic vars 51385 1727204584.39776: starting attempt loop 51385 1727204584.39779: running the handler 51385 1727204584.39794: _low_level_execute_command(): starting 51385 1727204584.39801: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 51385 1727204584.40569: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204584.40589: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204584.40598: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204584.40619: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204584.40657: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204584.40671: stderr chunk (state=3): >>>debug2: match not found <<< 51385 1727204584.40681: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 
1727204584.40694: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 51385 1727204584.40707: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 51385 1727204584.40715: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 51385 1727204584.40724: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204584.40735: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204584.40752: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204584.40760: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204584.40773: stderr chunk (state=3): >>>debug2: match found <<< 51385 1727204584.40782: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204584.40857: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204584.40887: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204584.40905: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204584.41002: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204584.43112: stdout chunk (state=3): >>>/root <<< 51385 1727204584.43368: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204584.43372: stdout chunk (state=3): >>><<< 51385 1727204584.43376: stderr chunk (state=3): >>><<< 51385 1727204584.43502: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204584.43513: _low_level_execute_command(): starting 51385 1727204584.43517: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204584.4339824-51610-127958764979452 `" && echo ansible-tmp-1727204584.4339824-51610-127958764979452="` echo /root/.ansible/tmp/ansible-tmp-1727204584.4339824-51610-127958764979452 `" ) && sleep 0' 51385 1727204584.44150: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204584.44177: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204584.44193: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204584.44213: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204584.44254: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204584.44281: stderr chunk (state=3): >>>debug2: match not found <<< 51385 
1727204584.44297: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204584.44316: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 51385 1727204584.44329: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 51385 1727204584.44341: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 51385 1727204584.44354: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204584.44374: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204584.44400: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204584.44414: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204584.44426: stderr chunk (state=3): >>>debug2: match found <<< 51385 1727204584.44442: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204584.44531: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204584.44554: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204584.44578: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204584.44678: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204584.47208: stdout chunk (state=3): >>>ansible-tmp-1727204584.4339824-51610-127958764979452=/root/.ansible/tmp/ansible-tmp-1727204584.4339824-51610-127958764979452 <<< 51385 1727204584.47472: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204584.47476: stdout chunk (state=3): >>><<< 51385 1727204584.47478: stderr chunk (state=3): >>><<< 51385 1727204584.47677: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204584.4339824-51610-127958764979452=/root/.ansible/tmp/ansible-tmp-1727204584.4339824-51610-127958764979452 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204584.47681: variable 'ansible_module_compression' from source: unknown 51385 1727204584.47683: ANSIBALLZ: Using lock for stat 51385 1727204584.47685: ANSIBALLZ: Acquiring lock 51385 1727204584.47688: ANSIBALLZ: Lock acquired: 140124836232224 51385 1727204584.47690: ANSIBALLZ: Creating module 51385 1727204584.67785: ANSIBALLZ: Writing module into payload 51385 1727204584.68176: ANSIBALLZ: Writing module 51385 1727204584.68209: ANSIBALLZ: Renaming module 51385 1727204584.68219: ANSIBALLZ: Done creating module 51385 1727204584.68242: variable 'ansible_facts' from source: unknown 51385 1727204584.68324: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1727204584.4339824-51610-127958764979452/AnsiballZ_stat.py 51385 1727204584.69192: Sending initial data 51385 1727204584.69196: Sent initial data (153 bytes) 51385 1727204584.71864: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204584.71869: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204584.72024: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204584.72029: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found <<< 51385 1727204584.72032: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204584.72085: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204584.72252: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204584.72255: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204584.72431: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204584.74910: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: 
Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 51385 1727204584.74971: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 51385 1727204584.75208: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-51385tpxlmlox/tmp0yw4mo8p /root/.ansible/tmp/ansible-tmp-1727204584.4339824-51610-127958764979452/AnsiballZ_stat.py <<< 51385 1727204584.75212: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 51385 1727204584.76852: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204584.76856: stderr chunk (state=3): >>><<< 51385 1727204584.76863: stdout chunk (state=3): >>><<< 51385 1727204584.76887: done transferring module to remote 51385 1727204584.76902: _low_level_execute_command(): starting 51385 1727204584.76905: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204584.4339824-51610-127958764979452/ /root/.ansible/tmp/ansible-tmp-1727204584.4339824-51610-127958764979452/AnsiballZ_stat.py && sleep 0' 51385 1727204584.78068: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204584.78081: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204584.78113: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 51385 
1727204584.78116: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration <<< 51385 1727204584.78130: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204584.78135: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found <<< 51385 1727204584.78149: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204584.78226: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204584.78269: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204584.78327: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204584.80858: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204584.80870: stderr chunk (state=3): >>><<< 51385 1727204584.80873: stdout chunk (state=3): >>><<< 51385 1727204584.80895: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204584.80899: _low_level_execute_command(): starting 51385 1727204584.80903: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204584.4339824-51610-127958764979452/AnsiballZ_stat.py && sleep 0' 51385 1727204584.81849: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204584.81853: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204584.81899: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204584.81903: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204584.81937: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204584.81941: stderr chunk (state=3): >>>debug2: match found 
<<< 51385 1727204584.81943: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204584.82005: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204584.82008: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204584.82023: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204584.82101: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204584.84957: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin <<< 51385 1727204584.84969: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 51385 1727204584.85036: stdout chunk (state=3): >>>import '_io' # <<< 51385 1727204584.85041: stdout chunk (state=3): >>>import 'marshal' # <<< 51385 1727204584.85137: stdout chunk (state=3): >>>import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook <<< 51385 1727204584.85200: stdout chunk (state=3): >>>import 'time' # <<< 51385 1727204584.85206: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<< 51385 1727204584.85309: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py <<< 51385 1727204584.85381: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' <<< 51385 1727204584.85392: stdout chunk (state=3): >>>import '_codecs' # import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0bf3dc0> <<< 51385 1727204584.85511: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc 
matches /usr/lib64/python3.9/encodings/aliases.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0b983a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0bf3b20> # /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0bf3ac0> <<< 51385 1727204584.85544: stdout chunk (state=3): >>>import '_signal' # <<< 51385 1727204584.85608: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' <<< 51385 1727204584.85654: stdout chunk (state=3): >>>import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0b98490> <<< 51385 1727204584.85665: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' <<< 51385 1727204584.85680: stdout chunk (state=3): >>>import '_abc' # <<< 51385 1727204584.85694: stdout chunk (state=3): >>>import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0b98940> <<< 51385 1727204584.85758: stdout chunk (state=3): >>>import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0b98670> # /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py <<< 51385 1727204584.85767: stdout chunk (state=3): >>># code 
object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' <<< 51385 1727204584.85793: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py <<< 51385 1727204584.85814: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' <<< 51385 1727204584.85841: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py <<< 51385 1727204584.85866: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' <<< 51385 1727204584.85894: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0b4f190> <<< 51385 1727204584.85923: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py <<< 51385 1727204584.85947: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' <<< 51385 1727204584.86056: stdout chunk (state=3): >>>import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0b4f220> <<< 51385 1727204584.86091: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py <<< 51385 1727204584.86100: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' <<< 51385 1727204584.86137: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' <<< 51385 1727204584.86140: stdout chunk (state=3): >>>import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0b72850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fc4f0b4f940> <<< 51385 1727204584.86192: stdout chunk (state=3): >>>import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0bb0880> <<< 51385 1727204584.86209: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' <<< 51385 1727204584.86234: stdout chunk (state=3): >>>import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0b48d90> <<< 51385 1727204584.86300: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' <<< 51385 1727204584.86310: stdout chunk (state=3): >>>import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0b72d90> <<< 51385 1727204584.86396: stdout chunk (state=3): >>>import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0b98970> <<< 51385 1727204584.86433: stdout chunk (state=3): >>>Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 51385 1727204584.86751: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py <<< 51385 1727204584.86788: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' <<< 51385 1727204584.86810: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py <<< 51385 1727204584.86816: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' <<< 51385 1727204584.86892: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py <<< 51385 1727204584.86904: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' <<< 51385 1727204584.86938: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0aedf10> <<< 51385 1727204584.86986: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0af40a0> <<< 51385 1727204584.87012: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py <<< 51385 1727204584.87019: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' <<< 51385 1727204584.87050: stdout chunk (state=3): >>>import '_sre' # <<< 51385 1727204584.87079: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py <<< 51385 1727204584.87088: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' <<< 51385 1727204584.87112: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py <<< 51385 1727204584.87120: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' <<< 51385 1727204584.87137: stdout chunk (state=3): >>>import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0ae75b0> <<< 51385 1727204584.87171: stdout chunk (state=3): >>>import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0aee6a0> <<< 51385 1727204584.87180: stdout chunk (state=3): >>>import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0aed3d0> <<< 51385 1727204584.87200: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py <<< 51385 1727204584.87281: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' <<< 51385 1727204584.87311: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py <<< 51385 1727204584.87345: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' <<< 51385 1727204584.87375: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' <<< 51385 1727204584.87419: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc4f0a71e80> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0a71970> <<< 
51385 1727204584.87432: stdout chunk (state=3): >>>import 'itertools' # <<< 51385 1727204584.87463: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py <<< 51385 1727204584.87470: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0a71f70> <<< 51385 1727204584.87503: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py <<< 51385 1727204584.87508: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' <<< 51385 1727204584.87538: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0a71dc0> <<< 51385 1727204584.87579: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py <<< 51385 1727204584.87582: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0a81130><<< 51385 1727204584.87590: stdout chunk (state=3): >>> <<< 51385 1727204584.87603: stdout chunk (state=3): >>>import '_collections' # <<< 51385 1727204584.87673: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0ac9df0> <<< 51385 1727204584.87681: stdout chunk (state=3): >>>import '_functools' # <<< 51385 1727204584.87715: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0ac26d0> <<< 51385 1727204584.87792: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from 
'/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' <<< 51385 1727204584.87795: stdout chunk (state=3): >>>import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0ad5730> <<< 51385 1727204584.87797: stdout chunk (state=3): >>>import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0af5e80> <<< 51385 1727204584.87834: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' <<< 51385 1727204584.87875: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc4f0a81d30> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0ac9310> <<< 51385 1727204584.88273: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc4f0ad5340> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0afba30> # /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code 
object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0a81f10> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0a81e50> # /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0a81dc0> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py # code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0a4c430> # /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' <<< 51385 1727204584.88287: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0a4c520> <<< 51385 1727204584.88485: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0a8afa0> <<< 51385 1727204584.88588: stdout chunk (state=3): >>>import 'importlib.abc' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0a84af0> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0a844c0> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py <<< 51385 1727204584.88645: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' <<< 51385 1727204584.88780: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0754280> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0a37dc0> <<< 51385 1727204584.88992: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0a84f70> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0afb0a0> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py # code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' <<< 51385 1727204584.89012: stdout chunk (state=3): >>>import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0765bb0> import 'errno' # <<< 51385 1727204584.89268: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from 
'/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc4f0765ee0> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f07777f0> # /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py <<< 51385 1727204584.89318: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' <<< 51385 1727204584.89412: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0777d30> <<< 51385 1727204584.89446: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc4f0705460> <<< 51385 1727204584.89514: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0765fd0> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py <<< 51385 1727204584.89668: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fc4f0715340><<< 51385 1727204584.89724: stdout chunk (state=3): >>> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0777670> import 'pwd' # <<< 51385 1727204584.89731: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc4f0715400> <<< 51385 1727204584.89788: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0a81a90> <<< 51385 1727204584.89823: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py <<< 51385 1727204584.89852: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' <<< 51385 1727204584.89911: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py <<< 51385 1727204584.89918: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' <<< 51385 1727204584.90395: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' <<< 51385 1727204584.90410: stdout chunk (state=3): >>># extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc4f0731760> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from 
'/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc4f0731a30> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0731820> # extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc4f0731910> # /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' <<< 51385 1727204584.90511: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' <<< 51385 1727204584.90517: stdout chunk (state=3): >>>import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc4f0731d60> <<< 51385 1727204584.90604: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc4f073b2b0> <<< 51385 1727204584.90618: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f07319a0> <<< 51385 1727204584.90650: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0725af0><<< 51385 
1727204584.90653: stdout chunk (state=3): >>> <<< 51385 1727204584.90691: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0a81670><<< 51385 1727204584.90696: stdout chunk (state=3): >>> <<< 51385 1727204584.90727: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py<<< 51385 1727204584.90736: stdout chunk (state=3): >>> <<< 51385 1727204584.90815: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' <<< 51385 1727204584.90868: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0731b50> <<< 51385 1727204584.91011: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/cp437.pyc' <<< 51385 1727204584.91039: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fc4f0659730> <<< 51385 1727204584.91272: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_d0qpp_4a/ansible_stat_payload.zip' <<< 51385 1727204584.91288: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.91449: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.91495: stdout chunk (state=3): >>>import ansible # loaded from Zip /tmp/ansible_stat_payload_d0qpp_4a/ansible_stat_payload.zip/ansible/__init__.py<<< 51385 1727204584.91509: stdout chunk (state=3): >>> <<< 51385 1727204584.91516: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.91544: stdout chunk (state=3): >>># zipimport: zlib available<<< 51385 1727204584.91550: stdout chunk (state=3): >>> <<< 51385 1727204584.91576: stdout chunk (state=3): >>>import ansible.module_utils # loaded from Zip /tmp/ansible_stat_payload_d0qpp_4a/ansible_stat_payload.zip/ansible/module_utils/__init__.py<<< 51385 1727204584.91581: stdout chunk 
(state=3): >>> <<< 51385 1727204584.91604: stdout chunk (state=3): >>># zipimport: zlib available<<< 51385 1727204584.91608: stdout chunk (state=3): >>> <<< 51385 1727204584.93460: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.95168: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' <<< 51385 1727204584.95174: stdout chunk (state=3): >>>import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0559880> <<< 51385 1727204584.95203: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py <<< 51385 1727204584.95207: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' <<< 51385 1727204584.95239: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py <<< 51385 1727204584.95245: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' <<< 51385 1727204584.95269: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py <<< 51385 1727204584.95274: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' <<< 51385 1727204584.95299: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' <<< 51385 1727204584.95305: stdout chunk (state=3): >>># extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc4f05e86d0> <<< 51385 1727204584.95355: stdout chunk 
(state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f05e85b0> <<< 51385 1727204584.95398: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f05e82e0> <<< 51385 1727204584.95420: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py <<< 51385 1727204584.95425: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' <<< 51385 1727204584.95490: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f05e83d0> <<< 51385 1727204584.95493: stdout chunk (state=3): >>>import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f05e8100> <<< 51385 1727204584.95496: stdout chunk (state=3): >>>import 'atexit' # <<< 51385 1727204584.95529: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc4f05e8340> <<< 51385 1727204584.95558: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py <<< 51385 1727204584.95590: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' <<< 51385 1727204584.95649: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f05e8730> <<< 51385 1727204584.95669: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py <<< 51385 1727204584.95689: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' <<< 51385 1727204584.95707: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py <<< 51385 1727204584.95748: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' <<< 51385 1727204584.95779: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py <<< 51385 1727204584.95782: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' <<< 51385 1727204584.95898: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4effae7f0> <<< 51385 1727204584.95937: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc4effaebb0> <<< 51385 1727204584.95973: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc4effae8b0> <<< 51385 1727204584.96013: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py <<< 51385 1727204584.96045: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' <<< 51385 1727204584.96094: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object 
at 0x7fc4effcbaf0> <<< 51385 1727204584.96129: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f05e0d60> <<< 51385 1727204584.96396: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f05e8460> <<< 51385 1727204584.96414: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' <<< 51385 1727204584.96443: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f05e0220> <<< 51385 1727204584.96486: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py <<< 51385 1727204584.96492: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' <<< 51385 1727204584.96532: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' <<< 51385 1727204584.96544: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py <<< 51385 1727204584.96550: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' <<< 51385 1727204584.96593: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' <<< 51385 1727204584.96596: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0554be0> <<< 51385 1727204584.96709: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fc4f058af10> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f058a8b0> <<< 51385 1727204584.96735: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4effc52e0> <<< 51385 1727204584.96752: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc4f058a9a0> <<< 51385 1727204584.96797: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f05b7d00> <<< 51385 1727204584.96821: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py <<< 51385 1727204584.96829: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' <<< 51385 1727204584.96863: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py <<< 51385 1727204584.96900: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' <<< 51385 1727204584.96988: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fc4eff90a30> <<< 51385 1727204584.96993: stdout chunk (state=3): >>>import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f05bff10> <<< 51385 1727204584.97027: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py <<< 51385 1727204584.97036: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' <<< 51385 1727204584.97126: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc4eff9e070> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f05bfeb0> <<< 51385 1727204584.97165: stdout chunk (state=3): >>># /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py <<< 51385 1727204584.97225: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' <<< 51385 1727204584.97237: stdout chunk (state=3): >>>import '_string' # <<< 51385 1727204584.97329: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f058c790> <<< 51385 1727204584.97542: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4eff9e0a0> <<< 51385 1727204584.97682: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # 
extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc4eff9b520> <<< 51385 1727204584.97702: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' <<< 51385 1727204584.97714: stdout chunk (state=3): >>># extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc4eff9b5e0> <<< 51385 1727204584.97752: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc4eff9ac10> <<< 51385 1727204584.97784: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0554670> <<< 51385 1727204584.97799: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' <<< 51385 1727204584.97816: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py <<< 51385 1727204584.97838: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' <<< 51385 1727204584.97880: stdout chunk (state=3): >>># extension module '_socket' loaded from 
'/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc4f0549bb0> <<< 51385 1727204584.98196: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc4f0548970> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4eff917c0> <<< 51385 1727204584.98240: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc4f05495e0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0580b50> <<< 51385 1727204584.98293: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_stat_payload_d0qpp_4a/ansible_stat_payload.zip/ansible/module_utils/compat/__init__.py <<< 51385 1727204584.98299: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.98401: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.98515: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.98542: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.98553: stdout chunk (state=3): >>>import ansible.module_utils.common # loaded from Zip 
/tmp/ansible_stat_payload_d0qpp_4a/ansible_stat_payload.zip/ansible/module_utils/common/__init__.py <<< 51385 1727204584.98577: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_stat_payload_d0qpp_4a/ansible_stat_payload.zip/ansible/module_utils/common/text/__init__.py <<< 51385 1727204584.98603: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.98747: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.98909: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204584.99689: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204585.00479: stdout chunk (state=3): >>>import ansible.module_utils.six # loaded from Zip /tmp/ansible_stat_payload_d0qpp_4a/ansible_stat_payload.zip/ansible/module_utils/six/__init__.py <<< 51385 1727204585.00483: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # <<< 51385 1727204585.00489: stdout chunk (state=3): >>>import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_stat_payload_d0qpp_4a/ansible_stat_payload.zip/ansible/module_utils/common/text/converters.py <<< 51385 1727204585.00524: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py <<< 51385 1727204585.00532: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' <<< 51385 1727204585.00597: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc4eff69c70> <<< 51385 
1727204585.00696: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' <<< 51385 1727204585.00710: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4eff553d0> <<< 51385 1727204585.00721: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4eff51d30> <<< 51385 1727204585.00781: stdout chunk (state=3): >>>import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_stat_payload_d0qpp_4a/ansible_stat_payload.zip/ansible/module_utils/compat/selinux.py <<< 51385 1727204585.00786: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204585.00816: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204585.00822: stdout chunk (state=3): >>>import ansible.module_utils._text # loaded from Zip /tmp/ansible_stat_payload_d0qpp_4a/ansible_stat_payload.zip/ansible/module_utils/_text.py <<< 51385 1727204585.00842: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204585.01027: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204585.01232: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' <<< 51385 1727204585.01280: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4eff697c0> <<< 51385 1727204585.01285: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204585.01929: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204585.02562: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204585.02640: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 
1727204585.02729: stdout chunk (state=3): >>>import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_stat_payload_d0qpp_4a/ansible_stat_payload.zip/ansible/module_utils/common/collections.py <<< 51385 1727204585.02749: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204585.02787: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204585.02836: stdout chunk (state=3): >>>import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_stat_payload_d0qpp_4a/ansible_stat_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available <<< 51385 1727204585.02931: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204585.03030: stdout chunk (state=3): >>>import ansible.module_utils.errors # loaded from Zip /tmp/ansible_stat_payload_d0qpp_4a/ansible_stat_payload.zip/ansible/module_utils/errors.py <<< 51385 1727204585.03046: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204585.03065: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204585.03079: stdout chunk (state=3): >>>import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_stat_payload_d0qpp_4a/ansible_stat_payload.zip/ansible/module_utils/parsing/__init__.py <<< 51385 1727204585.03095: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204585.03136: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204585.03181: stdout chunk (state=3): >>>import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_stat_payload_d0qpp_4a/ansible_stat_payload.zip/ansible/module_utils/parsing/convert_bool.py <<< 51385 1727204585.03203: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204585.03504: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204585.03811: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py <<< 51385 
1727204585.03851: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' <<< 51385 1727204585.03862: stdout chunk (state=3): >>>import '_ast' # <<< 51385 1727204585.03974: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f05832b0> <<< 51385 1727204585.03980: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204585.04065: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204585.04151: stdout chunk (state=3): >>>import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_stat_payload_d0qpp_4a/ansible_stat_payload.zip/ansible/module_utils/common/text/formatters.py <<< 51385 1727204585.04161: stdout chunk (state=3): >>>import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_stat_payload_d0qpp_4a/ansible_stat_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_stat_payload_d0qpp_4a/ansible_stat_payload.zip/ansible/module_utils/common/parameters.py <<< 51385 1727204585.04174: stdout chunk (state=3): >>>import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_stat_payload_d0qpp_4a/ansible_stat_payload.zip/ansible/module_utils/common/arg_spec.py <<< 51385 1727204585.04199: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204585.04247: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204585.04302: stdout chunk (state=3): >>>import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_stat_payload_d0qpp_4a/ansible_stat_payload.zip/ansible/module_utils/common/locale.py <<< 51385 1727204585.04305: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204585.04382: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204585.04423: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204585.04542: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 51385 1727204585.04628: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py <<< 51385 1727204585.04672: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' <<< 51385 1727204585.04780: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc4eff59ac0> <<< 51385 1727204585.04813: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4ef9cfc70> <<< 51385 1727204585.04852: stdout chunk (state=3): >>>import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_stat_payload_d0qpp_4a/ansible_stat_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_stat_payload_d0qpp_4a/ansible_stat_payload.zip/ansible/module_utils/common/process.py <<< 51385 1727204585.04869: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204585.05028: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204585.05109: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204585.05136: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204585.05197: stdout chunk (state=3): >>># /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py <<< 51385 1727204585.05212: stdout chunk (state=3): >>># code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' <<< 51385 
1727204585.05238: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py <<< 51385 1727204585.05284: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' <<< 51385 1727204585.05310: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py <<< 51385 1727204585.05326: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' <<< 51385 1727204585.05479: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4efb52c10> <<< 51385 1727204585.05530: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4eff5d3a0> <<< 51385 1727204585.05610: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4eff69be0> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_stat_payload_d0qpp_4a/ansible_stat_payload.zip/ansible/module_utils/distro/__init__.py <<< 51385 1727204585.05626: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204585.05655: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204585.05685: stdout chunk (state=3): >>>import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_stat_payload_d0qpp_4a/ansible_stat_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_stat_payload_d0qpp_4a/ansible_stat_payload.zip/ansible/module_utils/common/sys_info.py <<< 51385 1727204585.05782: stdout chunk (state=3): >>>import ansible.module_utils.basic # loaded from Zip /tmp/ansible_stat_payload_d0qpp_4a/ansible_stat_payload.zip/ansible/module_utils/basic.py <<< 51385 1727204585.05787: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 51385 1727204585.05807: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204585.05823: stdout chunk (state=3): >>>import ansible.modules # loaded from Zip /tmp/ansible_stat_payload_d0qpp_4a/ansible_stat_payload.zip/ansible/modules/__init__.py <<< 51385 1727204585.05829: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204585.05993: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204585.06267: stdout chunk (state=3): >>># zipimport: zlib available <<< 51385 1727204585.06439: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 51385 1727204585.06449: stdout chunk (state=3): >>># destroy __main__ <<< 51385 1727204585.06782: stdout chunk (state=3): >>># clear builtins._ # clear sys.path <<< 51385 1727204585.06820: stdout chunk (state=3): >>># clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport <<< 51385 1727204585.06907: stdout chunk (state=3): >>># cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing 
encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword <<< 51385 1727204585.06921: stdout chunk (state=3): >>># cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg<<< 51385 1727204585.06979: stdout chunk (state=3): >>> # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings<<< 51385 1727204585.07013: stdout chunk (state=3): >>> # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib<<< 51385 1727204585.07047: stdout chunk (state=3): >>> # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] 
removing pkgutil # destroy pkgutil<<< 51385 1727204585.07089: stdout chunk (state=3): >>> # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading<<< 51385 1727204585.07093: stdout chunk (state=3): >>> # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil<<< 51385 1727204585.07101: stdout chunk (state=3): >>> # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random<<< 51385 1727204585.07104: stdout chunk (state=3): >>> # cleanup[2] removing tempfile <<< 51385 1727204585.07106: stdout chunk (state=3): >>># cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ <<< 51385 1727204585.07109: stdout chunk (state=3): >>># cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess <<< 51385 1727204585.07111: stdout chunk (state=3): >>># cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex <<< 51385 1727204585.07113: stdout chunk (state=3): >>># cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] 
removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string<<< 51385 1727204585.07115: stdout chunk (state=3): >>> # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 <<< 51385 1727204585.07117: stdout chunk (state=3): >>># cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat<<< 51385 1727204585.07119: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common <<< 51385 1727204585.07120: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy 
ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors <<< 51385 1727204585.07122: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters <<< 51385 1727204585.07123: stdout chunk (state=3): >>># destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters<<< 51385 1727204585.07124: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux <<< 51385 1727204585.07125: stdout chunk (state=3): >>># cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file <<< 51385 1727204585.07126: stdout chunk (state=3): >>># destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils <<< 51385 1727204585.07128: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # 
cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 51385 1727204585.07342: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 51385 1727204585.07356: stdout chunk (state=3): >>># destroy importlib.util # destroy importlib.abc # destroy importlib.machinery <<< 51385 1727204585.07399: stdout chunk (state=3): >>># destroy zipimport <<< 51385 1727204585.07405: stdout chunk (state=3): >>># destroy _compression <<< 51385 1727204585.07408: stdout chunk (state=3): >>># destroy binascii # destroy importlib # destroy struct # destroy bz2 # destroy lzma <<< 51385 1727204585.07459: stdout chunk (state=3): >>># destroy __main__ # destroy locale<<< 51385 1727204585.07491: stdout chunk (state=3): >>> # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings <<< 51385 1727204585.07495: stdout chunk (state=3): >>># destroy syslog # destroy uuid # destroy array <<< 51385 1727204585.07519: stdout chunk (state=3): >>># destroy datetime <<< 51385 1727204585.07522: stdout chunk (state=3): >>># destroy selinux <<< 51385 1727204585.07525: stdout chunk (state=3): >>># destroy distro # destroy json # destroy shlex # destroy logging # destroy argparse <<< 51385 1727204585.07737: stdout chunk (state=3): >>># cleanup[3] wiping selinux._selinux <<< 51385 1727204585.07827: stdout chunk (state=3): >>># cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] 
wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref<<< 51385 1727204585.07932: stdout chunk (state=3): >>> # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external <<< 51385 1727204585.07935: stdout chunk (state=3): >>># cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools <<< 51385 1727204585.07938: stdout chunk (state=3): >>># cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator <<< 51385 1727204585.07943: stdout chunk (state=3): >>># cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] 
wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal <<< 51385 1727204585.08139: stdout chunk (state=3): >>># destroy platform <<< 51385 1727204585.08153: stdout chunk (state=3): >>># destroy _uuid # destroy _sre # destroy sre_parse <<< 51385 1727204585.08175: stdout chunk (state=3): >>># destroy tokenize # destroy _heapq <<< 51385 1727204585.08178: stdout chunk (state=3): >>># destroy posixpath # destroy stat <<< 51385 1727204585.08215: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors <<< 51385 1727204585.08230: stdout chunk (state=3): >>># destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser <<< 51385 1727204585.08252: stdout chunk (state=3): >>># destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator <<< 51385 1727204585.08258: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy io <<< 51385 1727204585.08268: stdout 
chunk (state=3): >>># destroy marshal <<< 51385 1727204585.08297: stdout chunk (state=3): >>># destroy _frozen_importlib # clear sys.audit hooks <<< 51385 1727204585.08694: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. <<< 51385 1727204585.08757: stderr chunk (state=3): >>><<< 51385 1727204585.08766: stdout chunk (state=3): >>><<< 51385 1727204585.08838: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py # code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0bf3dc0> # /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0b983a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0bf3b20> # /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0bf3ac0> import '_signal' # # 
/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0b98490> # /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0b98940> import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0b98670> # /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py # code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py # code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0b4f190> # /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py # code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0b4f220> # /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # 
code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0b72850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0b4f940> import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0bb0880> # /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0b48d90> # /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0b72d90> import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0b98970> Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py # code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0aedf10> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0af40a0> # /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' import '_sre' # # /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0ae75b0> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0aee6a0> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0aed3d0> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py # code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches 
/usr/lib64/python3.9/collections/__init__.py # code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc4f0a71e80> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0a71970> import 'itertools' # # /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0a71f70> # /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0a71dc0> # /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0a81130> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0ac9df0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0ac26d0> # /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0ad5730> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0af5e80> # /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc4f0a81d30> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0ac9310> # extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc4f0ad5340> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0afba30> # /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0a81f10> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0a81e50> # /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches 
/usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0a81dc0> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py # code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0a4c430> # /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0a4c520> import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0a8afa0> import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0a84af0> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0a844c0> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from 
'/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0754280> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0a37dc0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0a84f70> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0afb0a0> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py # code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0765bb0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc4f0765ee0> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f07777f0> # /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py # code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' 
import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0777d30> # extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc4f0705460> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0765fd0> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc4f0715340> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0777670> import 'pwd' # # extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc4f0715400> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0a81a90> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' # extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from 
'/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc4f0731760> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc4f0731a30> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0731820> # extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc4f0731910> # /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc4f0731d60> # extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc4f073b2b0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f07319a0> import 
'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0725af0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0a81670> # /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py # code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0731b50> # code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fc4f0659730> # zipimport: found 30 names in '/tmp/ansible_stat_payload_d0qpp_4a/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available import ansible # loaded from Zip /tmp/ansible_stat_payload_d0qpp_4a/ansible_stat_payload.zip/ansible/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_stat_payload_d0qpp_4a/ansible_stat_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0559880> # /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' # extension module '_json' loaded 
from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc4f05e86d0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f05e85b0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f05e82e0> # /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f05e83d0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f05e8100> import 'atexit' # # extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc4f05e8340> # /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py # code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f05e8730> # /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py # code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' import 'signal' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fc4effae7f0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc4effaebb0> # extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc4effae8b0> # /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4effcbaf0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f05e0d60> import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f05e8460> # /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f05e0220> # /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py # code object from 
'/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0554be0> import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f058af10> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f058a8b0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4effc52e0> # extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc4f058a9a0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f05b7d00> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py # code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc4eff90a30> 
import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f05bff10> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc4eff9e070> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f05bfeb0> # /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py # code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f058c790> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4eff9e0a0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc4eff9b520> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc4eff9b5e0> # extension module 
'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc4eff9ac10> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0554670> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc4f0549bb0> # extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc4f0548970> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4eff917c0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc4f05495e0> import 'systemd.daemon' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fc4f0580b50> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_stat_payload_d0qpp_4a/ansible_stat_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_stat_payload_d0qpp_4a/ansible_stat_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_stat_payload_d0qpp_4a/ansible_stat_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.six # loaded from Zip /tmp/ansible_stat_payload_d0qpp_4a/ansible_stat_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_stat_payload_d0qpp_4a/ansible_stat_payload.zip/ansible/module_utils/common/text/converters.py # /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc4eff69c70> # /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from 
'/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4eff553d0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4eff51d30> import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_stat_payload_d0qpp_4a/ansible_stat_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_stat_payload_d0qpp_4a/ansible_stat_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4eff697c0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_stat_payload_d0qpp_4a/ansible_stat_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_stat_payload_d0qpp_4a/ansible_stat_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.errors # loaded from Zip /tmp/ansible_stat_payload_d0qpp_4a/ansible_stat_payload.zip/ansible/module_utils/errors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_stat_payload_d0qpp_4a/ansible_stat_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing.convert_bool # loaded from Zip 
/tmp/ansible_stat_payload_d0qpp_4a/ansible_stat_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py # code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4f05832b0> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_stat_payload_d0qpp_4a/ansible_stat_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_stat_payload_d0qpp_4a/ansible_stat_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_stat_payload_d0qpp_4a/ansible_stat_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_stat_payload_d0qpp_4a/ansible_stat_payload.zip/ansible/module_utils/common/arg_spec.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_stat_payload_d0qpp_4a/ansible_stat_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' 
import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc4eff59ac0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4ef9cfc70> import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_stat_payload_d0qpp_4a/ansible_stat_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_stat_payload_d0qpp_4a/ansible_stat_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py # code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py # code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py # code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4efb52c10> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4eff5d3a0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc4eff69be0> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_stat_payload_d0qpp_4a/ansible_stat_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_stat_payload_d0qpp_4a/ansible_stat_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip 
/tmp/ansible_stat_payload_d0qpp_4a/ansible_stat_payload.zip/ansible/module_utils/common/sys_info.py import ansible.module_utils.basic # loaded from Zip /tmp/ansible_stat_payload_d0qpp_4a/ansible_stat_payload.zip/ansible/module_utils/basic.py # zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_stat_payload_d0qpp_4a/ansible_stat_payload.zip/ansible/modules/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # 
cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] 
removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy 
ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # 
destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy struct # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy array # destroy datetime # destroy selinux # destroy distro # destroy json # destroy shlex # destroy logging # destroy argparse # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # 
cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] 
wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. [WARNING]: Module invocation had junk after the JSON data: (same interpreter cleanup/destroy output as shown above)
destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # 
destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks 51385 1727204585.09930: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204584.4339824-51610-127958764979452/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 51385 1727204585.09934: _low_level_execute_command(): starting 51385 1727204585.09936: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204584.4339824-51610-127958764979452/ > /dev/null 2>&1 && sleep 0' 51385 1727204585.09941: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204585.09944: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204585.09945: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204585.09947: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204585.09950: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204585.09952: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204585.09953: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204585.09955: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204585.09956: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204585.09957: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204585.10006: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204585.12694: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204585.12897: stderr chunk (state=3): >>><<< 51385 1727204585.12910: stdout chunk (state=3): >>><<< 51385 1727204585.13292: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204585.13297: handler run complete 51385 1727204585.13299: attempt loop complete, returning result 51385 1727204585.13301: _execute() done 51385 1727204585.13303: dumping result to json 51385 1727204585.13305: done dumping result, returning 51385 1727204585.13307: done running TaskExecutor() for managed-node1/TASK: Check if system is ostree [0affcd87-79f5-6b1f-5706-0000000000c2] 51385 1727204585.13309: sending task result for task 0affcd87-79f5-6b1f-5706-0000000000c2 51385 1727204585.13481: done sending task result for task 0affcd87-79f5-6b1f-5706-0000000000c2 51385 1727204585.13486: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "stat": { "exists": false } } 51385 1727204585.13547: no more pending results, returning what we have 51385 1727204585.13550: results queue empty 51385 1727204585.13551: checking for any_errors_fatal 51385 1727204585.13558: done checking for any_errors_fatal 51385 1727204585.13559: checking for max_fail_percentage 51385 1727204585.13560: done 
checking for max_fail_percentage 51385 1727204585.13561: checking to see if all hosts have failed and the running result is not ok 51385 1727204585.13562: done checking to see if all hosts have failed 51385 1727204585.13563: getting the remaining hosts for this loop 51385 1727204585.13566: done getting the remaining hosts for this loop 51385 1727204585.13570: getting the next task for host managed-node1 51385 1727204585.13575: done getting next task for host managed-node1 51385 1727204585.13578: ^ task is: TASK: Set flag to indicate system is ostree 51385 1727204585.13581: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51385 1727204585.13583: getting variables 51385 1727204585.13585: in VariableManager get_vars() 51385 1727204585.13612: Calling all_inventory to load vars for managed-node1 51385 1727204585.13614: Calling groups_inventory to load vars for managed-node1 51385 1727204585.13618: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204585.13627: Calling all_plugins_play to load vars for managed-node1 51385 1727204585.13630: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204585.13632: Calling groups_plugins_play to load vars for managed-node1 51385 1727204585.13826: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204585.14089: done with get_vars() 51385 1727204585.14131: done getting variables 51385 1727204585.14259: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Tuesday 24 September 2024 15:03:05 -0400 (0:00:00.773) 0:00:03.546 ***** 51385 1727204585.14288: entering _queue_task() for managed-node1/set_fact 51385 1727204585.14289: Creating lock for set_fact 51385 1727204585.14525: worker is 1 (out of 1 available) 51385 1727204585.14538: exiting _queue_task() for managed-node1/set_fact 51385 1727204585.14549: done queuing things up, now waiting for results queue to drain 51385 1727204585.14550: waiting for pending results... 
51385 1727204585.14702: running TaskExecutor() for managed-node1/TASK: Set flag to indicate system is ostree 51385 1727204585.14769: in run() - task 0affcd87-79f5-6b1f-5706-0000000000c3 51385 1727204585.14777: variable 'ansible_search_path' from source: unknown 51385 1727204585.14780: variable 'ansible_search_path' from source: unknown 51385 1727204585.14809: calling self._execute() 51385 1727204585.14867: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204585.14871: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204585.14878: variable 'omit' from source: magic vars 51385 1727204585.15213: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 51385 1727204585.15437: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 51385 1727204585.15472: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 51385 1727204585.15498: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 51385 1727204585.15523: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 51385 1727204585.15592: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 51385 1727204585.15609: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 51385 1727204585.15627: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 51385 1727204585.15645: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 51385 1727204585.15737: Evaluated conditional (not __network_is_ostree is defined): True 51385 1727204585.15741: variable 'omit' from source: magic vars 51385 1727204585.15766: variable 'omit' from source: magic vars 51385 1727204585.15851: variable '__ostree_booted_stat' from source: set_fact 51385 1727204585.15890: variable 'omit' from source: magic vars 51385 1727204585.15912: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 51385 1727204585.15933: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 51385 1727204585.15947: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 51385 1727204585.15967: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 51385 1727204585.15982: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 51385 1727204585.16031: variable 'inventory_hostname' from source: host vars for 'managed-node1' 51385 1727204585.16039: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204585.16047: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204585.16155: Set connection var ansible_pipelining to False 51385 1727204585.16168: Set connection var ansible_shell_type to sh 51385 1727204585.16184: Set connection var ansible_module_compression to ZIP_DEFLATED 51385 1727204585.16196: Set connection var ansible_timeout to 10 51385 1727204585.16203: Set connection var ansible_connection to ssh 51385 1727204585.16213: Set connection var ansible_shell_executable to /bin/sh 51385 1727204585.16251: variable 'ansible_shell_executable' 
from source: unknown 51385 1727204585.16258: variable 'ansible_connection' from source: unknown 51385 1727204585.16271: variable 'ansible_module_compression' from source: unknown 51385 1727204585.16278: variable 'ansible_shell_type' from source: unknown 51385 1727204585.16285: variable 'ansible_shell_executable' from source: unknown 51385 1727204585.16290: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204585.16296: variable 'ansible_pipelining' from source: unknown 51385 1727204585.16300: variable 'ansible_timeout' from source: unknown 51385 1727204585.16305: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204585.16403: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 51385 1727204585.16416: variable 'omit' from source: magic vars 51385 1727204585.16423: starting attempt loop 51385 1727204585.16429: running the handler 51385 1727204585.16444: handler run complete 51385 1727204585.16472: attempt loop complete, returning result 51385 1727204585.16478: _execute() done 51385 1727204585.16483: dumping result to json 51385 1727204585.16490: done dumping result, returning 51385 1727204585.16497: done running TaskExecutor() for managed-node1/TASK: Set flag to indicate system is ostree [0affcd87-79f5-6b1f-5706-0000000000c3] 51385 1727204585.16505: sending task result for task 0affcd87-79f5-6b1f-5706-0000000000c3 ok: [managed-node1] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 51385 1727204585.16906: no more pending results, returning what we have 51385 1727204585.16909: results queue empty 51385 1727204585.16910: checking for any_errors_fatal 51385 1727204585.16915: done checking for any_errors_fatal 51385 
1727204585.16916: checking for max_fail_percentage 51385 1727204585.16918: done checking for max_fail_percentage 51385 1727204585.16919: checking to see if all hosts have failed and the running result is not ok 51385 1727204585.16920: done checking to see if all hosts have failed 51385 1727204585.16920: getting the remaining hosts for this loop 51385 1727204585.16922: done getting the remaining hosts for this loop 51385 1727204585.16925: getting the next task for host managed-node1 51385 1727204585.16933: done getting next task for host managed-node1 51385 1727204585.16936: ^ task is: TASK: Fix CentOS6 Base repo 51385 1727204585.16938: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51385 1727204585.16941: getting variables 51385 1727204585.16943: in VariableManager get_vars() 51385 1727204585.16972: Calling all_inventory to load vars for managed-node1 51385 1727204585.16975: Calling groups_inventory to load vars for managed-node1 51385 1727204585.16978: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204585.16989: Calling all_plugins_play to load vars for managed-node1 51385 1727204585.16992: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204585.16995: Calling groups_plugins_play to load vars for managed-node1 51385 1727204585.17204: done sending task result for task 0affcd87-79f5-6b1f-5706-0000000000c3 51385 1727204585.17213: WORKER PROCESS EXITING 51385 1727204585.17236: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204585.17459: done with get_vars() 51385 1727204585.17474: done getting variables 51385 1727204585.17595: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Tuesday 24 September 2024 15:03:05 -0400 (0:00:00.033) 0:00:03.580 ***** 51385 1727204585.17633: entering _queue_task() for managed-node1/copy 51385 1727204585.17920: worker is 1 (out of 1 available) 51385 1727204585.17932: exiting _queue_task() for managed-node1/copy 51385 1727204585.17944: done queuing things up, now waiting for results queue to drain 51385 1727204585.17952: waiting for pending results... 
51385 1727204585.18226: running TaskExecutor() for managed-node1/TASK: Fix CentOS6 Base repo 51385 1727204585.18353: in run() - task 0affcd87-79f5-6b1f-5706-0000000000c5 51385 1727204585.18377: variable 'ansible_search_path' from source: unknown 51385 1727204585.18386: variable 'ansible_search_path' from source: unknown 51385 1727204585.18433: calling self._execute() 51385 1727204585.18526: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204585.18537: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204585.18552: variable 'omit' from source: magic vars 51385 1727204585.19077: variable 'ansible_distribution' from source: facts 51385 1727204585.19103: Evaluated conditional (ansible_distribution == 'CentOS'): True 51385 1727204585.19221: variable 'ansible_distribution_major_version' from source: facts 51385 1727204585.19232: Evaluated conditional (ansible_distribution_major_version == '6'): False 51385 1727204585.19239: when evaluation is False, skipping this task 51385 1727204585.19246: _execute() done 51385 1727204585.19252: dumping result to json 51385 1727204585.19262: done dumping result, returning 51385 1727204585.19280: done running TaskExecutor() for managed-node1/TASK: Fix CentOS6 Base repo [0affcd87-79f5-6b1f-5706-0000000000c5] 51385 1727204585.19290: sending task result for task 0affcd87-79f5-6b1f-5706-0000000000c5 skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 51385 1727204585.19466: no more pending results, returning what we have 51385 1727204585.19470: results queue empty 51385 1727204585.19471: checking for any_errors_fatal 51385 1727204585.19475: done checking for any_errors_fatal 51385 1727204585.19476: checking for max_fail_percentage 51385 1727204585.19478: done checking for max_fail_percentage 51385 1727204585.19479: checking to see if all hosts have failed and the 
running result is not ok 51385 1727204585.19480: done checking to see if all hosts have failed 51385 1727204585.19480: getting the remaining hosts for this loop 51385 1727204585.19482: done getting the remaining hosts for this loop 51385 1727204585.19486: getting the next task for host managed-node1 51385 1727204585.19494: done getting next task for host managed-node1 51385 1727204585.19496: ^ task is: TASK: Include the task 'enable_epel.yml' 51385 1727204585.19500: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51385 1727204585.19505: getting variables 51385 1727204585.19507: in VariableManager get_vars() 51385 1727204585.19536: Calling all_inventory to load vars for managed-node1 51385 1727204585.19539: Calling groups_inventory to load vars for managed-node1 51385 1727204585.19543: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204585.19556: Calling all_plugins_play to load vars for managed-node1 51385 1727204585.19559: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204585.19573: Calling groups_plugins_play to load vars for managed-node1 51385 1727204585.19795: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204585.20031: done with get_vars() 51385 1727204585.20043: done getting variables 51385 1727204585.20235: done sending task result for task 0affcd87-79f5-6b1f-5706-0000000000c5 51385 1727204585.20239: WORKER PROCESS EXITING TASK [Include the task 'enable_epel.yml'] ************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Tuesday 24 September 2024 15:03:05 -0400 (0:00:00.026) 0:00:03.606 ***** 51385 1727204585.20302: entering _queue_task() for managed-node1/include_tasks 51385 1727204585.20732: worker is 1 (out of 1 available) 51385 1727204585.20743: exiting _queue_task() for managed-node1/include_tasks 51385 1727204585.20755: done queuing things up, now waiting for results queue to drain 51385 1727204585.20757: waiting for pending results... 
51385 1727204585.21021: running TaskExecutor() for managed-node1/TASK: Include the task 'enable_epel.yml' 51385 1727204585.21143: in run() - task 0affcd87-79f5-6b1f-5706-0000000000c6 51385 1727204585.21165: variable 'ansible_search_path' from source: unknown 51385 1727204585.21175: variable 'ansible_search_path' from source: unknown 51385 1727204585.21229: calling self._execute() 51385 1727204585.21323: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204585.21336: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204585.21350: variable 'omit' from source: magic vars 51385 1727204585.21997: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 51385 1727204585.24355: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 51385 1727204585.24418: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 51385 1727204585.24446: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 51385 1727204585.24476: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 51385 1727204585.24496: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 51385 1727204585.24555: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 51385 1727204585.24581: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 51385 1727204585.24603: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 51385 1727204585.24631: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 51385 1727204585.24644: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 51385 1727204585.24734: variable '__network_is_ostree' from source: set_fact 51385 1727204585.24749: Evaluated conditional (not __network_is_ostree | d(false)): True 51385 1727204585.24758: _execute() done 51385 1727204585.24761: dumping result to json 51385 1727204585.24766: done dumping result, returning 51385 1727204585.24773: done running TaskExecutor() for managed-node1/TASK: Include the task 'enable_epel.yml' [0affcd87-79f5-6b1f-5706-0000000000c6] 51385 1727204585.24778: sending task result for task 0affcd87-79f5-6b1f-5706-0000000000c6 51385 1727204585.24891: no more pending results, returning what we have 51385 1727204585.24895: in VariableManager get_vars() 51385 1727204585.24929: Calling all_inventory to load vars for managed-node1 51385 1727204585.24931: Calling groups_inventory to load vars for managed-node1 51385 1727204585.24934: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204585.24946: Calling all_plugins_play to load vars for managed-node1 51385 1727204585.24949: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204585.24952: Calling groups_plugins_play to load vars for managed-node1 51385 1727204585.25154: done sending task result for task 0affcd87-79f5-6b1f-5706-0000000000c6 51385 1727204585.25158: WORKER PROCESS EXITING 51385 1727204585.25172: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved 
name 51385 1727204585.25316: done with get_vars() 51385 1727204585.25323: variable 'ansible_search_path' from source: unknown 51385 1727204585.25324: variable 'ansible_search_path' from source: unknown 51385 1727204585.25355: we have included files to process 51385 1727204585.25357: generating all_blocks data 51385 1727204585.25358: done generating all_blocks data 51385 1727204585.25367: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 51385 1727204585.25369: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 51385 1727204585.25374: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 51385 1727204585.26075: done processing included file 51385 1727204585.26078: iterating over new_blocks loaded from include file 51385 1727204585.26079: in VariableManager get_vars() 51385 1727204585.26091: done with get_vars() 51385 1727204585.26093: filtering new block on tags 51385 1727204585.26115: done filtering new block on tags 51385 1727204585.26118: in VariableManager get_vars() 51385 1727204585.26129: done with get_vars() 51385 1727204585.26131: filtering new block on tags 51385 1727204585.26142: done filtering new block on tags 51385 1727204585.26144: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed-node1 51385 1727204585.26149: extending task lists for all hosts with included blocks 51385 1727204585.26252: done extending task lists 51385 1727204585.26253: done processing included files 51385 1727204585.26254: results queue empty 51385 1727204585.26255: checking for any_errors_fatal 51385 1727204585.26258: done checking for any_errors_fatal 51385 1727204585.26259: checking for max_fail_percentage 51385 1727204585.26262: done 
checking for max_fail_percentage 51385 1727204585.26263: checking to see if all hosts have failed and the running result is not ok 51385 1727204585.26265: done checking to see if all hosts have failed 51385 1727204585.26266: getting the remaining hosts for this loop 51385 1727204585.26267: done getting the remaining hosts for this loop 51385 1727204585.26270: getting the next task for host managed-node1 51385 1727204585.26274: done getting next task for host managed-node1 51385 1727204585.26276: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 51385 1727204585.26279: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51385 1727204585.26281: getting variables 51385 1727204585.26282: in VariableManager get_vars() 51385 1727204585.26290: Calling all_inventory to load vars for managed-node1 51385 1727204585.26292: Calling groups_inventory to load vars for managed-node1 51385 1727204585.26295: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204585.26300: Calling all_plugins_play to load vars for managed-node1 51385 1727204585.26308: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204585.26311: Calling groups_plugins_play to load vars for managed-node1 51385 1727204585.26627: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204585.26812: done with get_vars() 51385 1727204585.26818: done getting variables 51385 1727204585.26872: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 51385 1727204585.27011: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 9] *********************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Tuesday 24 September 2024 15:03:05 -0400 (0:00:00.067) 0:00:03.674 ***** 51385 1727204585.27044: entering _queue_task() for managed-node1/command 51385 1727204585.27045: Creating lock for command 51385 1727204585.27269: worker is 1 (out of 1 available) 51385 1727204585.27283: exiting _queue_task() for managed-node1/command 51385 1727204585.27294: done queuing things up, now waiting for results queue to drain 51385 1727204585.27296: waiting for pending results... 
51385 1727204585.27458: running TaskExecutor() for managed-node1/TASK: Create EPEL 9 51385 1727204585.27529: in run() - task 0affcd87-79f5-6b1f-5706-0000000000e0 51385 1727204585.27538: variable 'ansible_search_path' from source: unknown 51385 1727204585.27541: variable 'ansible_search_path' from source: unknown 51385 1727204585.27577: calling self._execute() 51385 1727204585.27629: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204585.27634: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204585.27644: variable 'omit' from source: magic vars 51385 1727204585.27913: variable 'ansible_distribution' from source: facts 51385 1727204585.27923: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 51385 1727204585.28011: variable 'ansible_distribution_major_version' from source: facts 51385 1727204585.28016: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 51385 1727204585.28019: when evaluation is False, skipping this task 51385 1727204585.28022: _execute() done 51385 1727204585.28025: dumping result to json 51385 1727204585.28027: done dumping result, returning 51385 1727204585.28035: done running TaskExecutor() for managed-node1/TASK: Create EPEL 9 [0affcd87-79f5-6b1f-5706-0000000000e0] 51385 1727204585.28041: sending task result for task 0affcd87-79f5-6b1f-5706-0000000000e0 51385 1727204585.28136: done sending task result for task 0affcd87-79f5-6b1f-5706-0000000000e0 51385 1727204585.28139: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 51385 1727204585.28198: no more pending results, returning what we have 51385 1727204585.28202: results queue empty 51385 1727204585.28203: checking for any_errors_fatal 51385 1727204585.28204: done checking for any_errors_fatal 51385 1727204585.28204: checking for 
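From the trace above (task path `enable_epel.yml:8`, the `command` action plugin, and the two evaluated conditionals), the skipped task plausibly looks like the sketch below. This is a hypothetical reconstruction: the actual command body is not visible in the log and is shown as a placeholder.

```yaml
# Hypothetical reconstruction of enable_epel.yml:8 based only on log evidence.
# The templated name renders as "Create EPEL 9" because the host's
# ansible_distribution_major_version fact is 9.
- name: Create EPEL {{ ansible_distribution_major_version }}
  command: "true"   # placeholder -- the real command is not shown in the log
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version in ['7', '8']
```

Because the host reports major version 9, the second conditional evaluates False, which is exactly the `"false_condition": "ansible_distribution_major_version in ['7', '8']"` skip result recorded in the log.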
max_fail_percentage 51385 1727204585.28206: done checking for max_fail_percentage 51385 1727204585.28207: checking to see if all hosts have failed and the running result is not ok 51385 1727204585.28207: done checking to see if all hosts have failed 51385 1727204585.28208: getting the remaining hosts for this loop 51385 1727204585.28210: done getting the remaining hosts for this loop 51385 1727204585.28213: getting the next task for host managed-node1 51385 1727204585.28218: done getting next task for host managed-node1 51385 1727204585.28220: ^ task is: TASK: Install yum-utils package 51385 1727204585.28224: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51385 1727204585.28227: getting variables 51385 1727204585.28228: in VariableManager get_vars() 51385 1727204585.28263: Calling all_inventory to load vars for managed-node1 51385 1727204585.28267: Calling groups_inventory to load vars for managed-node1 51385 1727204585.28270: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204585.28279: Calling all_plugins_play to load vars for managed-node1 51385 1727204585.28281: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204585.28284: Calling groups_plugins_play to load vars for managed-node1 51385 1727204585.28425: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204585.28551: done with get_vars() 51385 1727204585.28567: done getting variables 51385 1727204585.28665: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Tuesday 24 September 2024 15:03:05 -0400 (0:00:00.016) 0:00:03.690 ***** 51385 1727204585.28704: entering _queue_task() for managed-node1/package 51385 1727204585.28706: Creating lock for package 51385 1727204585.28979: worker is 1 (out of 1 available) 51385 1727204585.28994: exiting _queue_task() for managed-node1/package 51385 1727204585.29005: done queuing things up, now waiting for results queue to drain 51385 1727204585.29006: waiting for pending results... 
51385 1727204585.29296: running TaskExecutor() for managed-node1/TASK: Install yum-utils package 51385 1727204585.29423: in run() - task 0affcd87-79f5-6b1f-5706-0000000000e1 51385 1727204585.29438: variable 'ansible_search_path' from source: unknown 51385 1727204585.29444: variable 'ansible_search_path' from source: unknown 51385 1727204585.29498: calling self._execute() 51385 1727204585.29585: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204585.29596: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204585.29611: variable 'omit' from source: magic vars 51385 1727204585.30032: variable 'ansible_distribution' from source: facts 51385 1727204585.30050: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 51385 1727204585.30199: variable 'ansible_distribution_major_version' from source: facts 51385 1727204585.30212: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 51385 1727204585.30232: when evaluation is False, skipping this task 51385 1727204585.30240: _execute() done 51385 1727204585.30248: dumping result to json 51385 1727204585.30256: done dumping result, returning 51385 1727204585.30272: done running TaskExecutor() for managed-node1/TASK: Install yum-utils package [0affcd87-79f5-6b1f-5706-0000000000e1] 51385 1727204585.30284: sending task result for task 0affcd87-79f5-6b1f-5706-0000000000e1 skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 51385 1727204585.30441: no more pending results, returning what we have 51385 1727204585.30445: results queue empty 51385 1727204585.30445: checking for any_errors_fatal 51385 1727204585.30453: done checking for any_errors_fatal 51385 1727204585.30454: checking for max_fail_percentage 51385 1727204585.30456: done checking for max_fail_percentage 51385 1727204585.30457: checking to see if 
all hosts have failed and the running result is not ok 51385 1727204585.30458: done checking to see if all hosts have failed 51385 1727204585.30458: getting the remaining hosts for this loop 51385 1727204585.30462: done getting the remaining hosts for this loop 51385 1727204585.30469: getting the next task for host managed-node1 51385 1727204585.30478: done getting next task for host managed-node1 51385 1727204585.30481: ^ task is: TASK: Enable EPEL 7 51385 1727204585.30485: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51385 1727204585.30488: getting variables 51385 1727204585.30490: in VariableManager get_vars() 51385 1727204585.30519: Calling all_inventory to load vars for managed-node1 51385 1727204585.30522: Calling groups_inventory to load vars for managed-node1 51385 1727204585.30526: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204585.30540: Calling all_plugins_play to load vars for managed-node1 51385 1727204585.30543: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204585.30547: Calling groups_plugins_play to load vars for managed-node1 51385 1727204585.30753: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204585.30919: done with get_vars() 51385 1727204585.30926: done getting variables 51385 1727204585.30959: done sending task result for task 0affcd87-79f5-6b1f-5706-0000000000e1 51385 1727204585.30966: WORKER PROCESS EXITING 51385 1727204585.30985: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Tuesday 24 September 2024 15:03:05 -0400 (0:00:00.023) 0:00:03.714 ***** 51385 1727204585.31009: entering _queue_task() for managed-node1/command 51385 1727204585.31184: worker is 1 (out of 1 available) 51385 1727204585.31198: exiting _queue_task() for managed-node1/command 51385 1727204585.31209: done queuing things up, now waiting for results queue to drain 51385 1727204585.31210: waiting for pending results... 
51385 1727204585.31363: running TaskExecutor() for managed-node1/TASK: Enable EPEL 7 51385 1727204585.31423: in run() - task 0affcd87-79f5-6b1f-5706-0000000000e2 51385 1727204585.31439: variable 'ansible_search_path' from source: unknown 51385 1727204585.31442: variable 'ansible_search_path' from source: unknown 51385 1727204585.31468: calling self._execute() 51385 1727204585.31518: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204585.31522: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204585.31530: variable 'omit' from source: magic vars 51385 1727204585.31795: variable 'ansible_distribution' from source: facts 51385 1727204585.31805: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 51385 1727204585.31935: variable 'ansible_distribution_major_version' from source: facts 51385 1727204585.31939: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 51385 1727204585.31943: when evaluation is False, skipping this task 51385 1727204585.31946: _execute() done 51385 1727204585.31949: dumping result to json 51385 1727204585.31953: done dumping result, returning 51385 1727204585.31958: done running TaskExecutor() for managed-node1/TASK: Enable EPEL 7 [0affcd87-79f5-6b1f-5706-0000000000e2] 51385 1727204585.31967: sending task result for task 0affcd87-79f5-6b1f-5706-0000000000e2 51385 1727204585.32051: done sending task result for task 0affcd87-79f5-6b1f-5706-0000000000e2 51385 1727204585.32054: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 51385 1727204585.32109: no more pending results, returning what we have 51385 1727204585.32112: results queue empty 51385 1727204585.32112: checking for any_errors_fatal 51385 1727204585.32117: done checking for any_errors_fatal 51385 1727204585.32118: checking for 
max_fail_percentage 51385 1727204585.32119: done checking for max_fail_percentage 51385 1727204585.32120: checking to see if all hosts have failed and the running result is not ok 51385 1727204585.32121: done checking to see if all hosts have failed 51385 1727204585.32122: getting the remaining hosts for this loop 51385 1727204585.32123: done getting the remaining hosts for this loop 51385 1727204585.32126: getting the next task for host managed-node1 51385 1727204585.32131: done getting next task for host managed-node1 51385 1727204585.32133: ^ task is: TASK: Enable EPEL 8 51385 1727204585.32136: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51385 1727204585.32139: getting variables 51385 1727204585.32140: in VariableManager get_vars() 51385 1727204585.32166: Calling all_inventory to load vars for managed-node1 51385 1727204585.32168: Calling groups_inventory to load vars for managed-node1 51385 1727204585.32170: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204585.32177: Calling all_plugins_play to load vars for managed-node1 51385 1727204585.32178: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204585.32180: Calling groups_plugins_play to load vars for managed-node1 51385 1727204585.32384: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204585.32601: done with get_vars() 51385 1727204585.32609: done getting variables 51385 1727204585.32663: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Tuesday 24 September 2024 15:03:05 -0400 (0:00:00.016) 0:00:03.730 ***** 51385 1727204585.32693: entering _queue_task() for managed-node1/command 51385 1727204585.32922: worker is 1 (out of 1 available) 51385 1727204585.32934: exiting _queue_task() for managed-node1/command 51385 1727204585.32946: done queuing things up, now waiting for results queue to drain 51385 1727204585.32947: waiting for pending results... 
51385 1727204585.33203: running TaskExecutor() for managed-node1/TASK: Enable EPEL 8 51385 1727204585.33322: in run() - task 0affcd87-79f5-6b1f-5706-0000000000e3 51385 1727204585.33337: variable 'ansible_search_path' from source: unknown 51385 1727204585.33341: variable 'ansible_search_path' from source: unknown 51385 1727204585.33375: calling self._execute() 51385 1727204585.33455: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204585.33458: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204585.33797: variable 'omit' from source: magic vars 51385 1727204585.33840: variable 'ansible_distribution' from source: facts 51385 1727204585.33844: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 51385 1727204585.33959: variable 'ansible_distribution_major_version' from source: facts 51385 1727204585.33969: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 51385 1727204585.33973: when evaluation is False, skipping this task 51385 1727204585.33976: _execute() done 51385 1727204585.33978: dumping result to json 51385 1727204585.33984: done dumping result, returning 51385 1727204585.33987: done running TaskExecutor() for managed-node1/TASK: Enable EPEL 8 [0affcd87-79f5-6b1f-5706-0000000000e3] 51385 1727204585.33996: sending task result for task 0affcd87-79f5-6b1f-5706-0000000000e3 51385 1727204585.34083: done sending task result for task 0affcd87-79f5-6b1f-5706-0000000000e3 51385 1727204585.34086: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 51385 1727204585.34140: no more pending results, returning what we have 51385 1727204585.34143: results queue empty 51385 1727204585.34144: checking for any_errors_fatal 51385 1727204585.34148: done checking for any_errors_fatal 51385 1727204585.34149: checking for 
max_fail_percentage 51385 1727204585.34150: done checking for max_fail_percentage 51385 1727204585.34151: checking to see if all hosts have failed and the running result is not ok 51385 1727204585.34152: done checking to see if all hosts have failed 51385 1727204585.34153: getting the remaining hosts for this loop 51385 1727204585.34154: done getting the remaining hosts for this loop 51385 1727204585.34157: getting the next task for host managed-node1 51385 1727204585.34166: done getting next task for host managed-node1 51385 1727204585.34168: ^ task is: TASK: Enable EPEL 6 51385 1727204585.34172: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51385 1727204585.34175: getting variables 51385 1727204585.34176: in VariableManager get_vars() 51385 1727204585.34199: Calling all_inventory to load vars for managed-node1 51385 1727204585.34201: Calling groups_inventory to load vars for managed-node1 51385 1727204585.34204: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204585.34213: Calling all_plugins_play to load vars for managed-node1 51385 1727204585.34215: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204585.34218: Calling groups_plugins_play to load vars for managed-node1 51385 1727204585.34409: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204585.34641: done with get_vars() 51385 1727204585.34651: done getting variables 51385 1727204585.34745: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Tuesday 24 September 2024 15:03:05 -0400 (0:00:00.020) 0:00:03.751 ***** 51385 1727204585.34796: entering _queue_task() for managed-node1/copy 51385 1727204585.35121: worker is 1 (out of 1 available) 51385 1727204585.35134: exiting _queue_task() for managed-node1/copy 51385 1727204585.35144: done queuing things up, now waiting for results queue to drain 51385 1727204585.35146: waiting for pending results... 
51385 1727204585.35307: running TaskExecutor() for managed-node1/TASK: Enable EPEL 6 51385 1727204585.35376: in run() - task 0affcd87-79f5-6b1f-5706-0000000000e5 51385 1727204585.35386: variable 'ansible_search_path' from source: unknown 51385 1727204585.35389: variable 'ansible_search_path' from source: unknown 51385 1727204585.35418: calling self._execute() 51385 1727204585.35474: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204585.35483: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204585.35496: variable 'omit' from source: magic vars 51385 1727204585.35924: variable 'ansible_distribution' from source: facts 51385 1727204585.35943: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 51385 1727204585.36072: variable 'ansible_distribution_major_version' from source: facts 51385 1727204585.36256: Evaluated conditional (ansible_distribution_major_version == '6'): False 51385 1727204585.36272: when evaluation is False, skipping this task 51385 1727204585.36280: _execute() done 51385 1727204585.36287: dumping result to json 51385 1727204585.36294: done dumping result, returning 51385 1727204585.36304: done running TaskExecutor() for managed-node1/TASK: Enable EPEL 6 [0affcd87-79f5-6b1f-5706-0000000000e5] 51385 1727204585.36314: sending task result for task 0affcd87-79f5-6b1f-5706-0000000000e5 51385 1727204585.36422: done sending task result for task 0affcd87-79f5-6b1f-5706-0000000000e5 51385 1727204585.36430: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 51385 1727204585.36481: no more pending results, returning what we have 51385 1727204585.36484: results queue empty 51385 1727204585.36485: checking for any_errors_fatal 51385 1727204585.36489: done checking for any_errors_fatal 51385 1727204585.36489: checking for max_fail_percentage 
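The four skipped tasks traced above (task paths, action plugins, and `false_condition` values all appear in the log) fit the rough skeleton below. The task names, line numbers, action plugins, and `when` expressions come from the log; every task body is a hypothetical placeholder.

```yaml
# Hypothetical skeleton of the remaining enable_epel.yml tasks, inferred from
# the task paths, action plugins, and false_condition values in the log.
- name: Install yum-utils package        # enable_epel.yml:26, 'package' action
  package:
    name: yum-utils                      # inferred from the task name
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version in ['7', '8']

- name: Enable EPEL 7                    # enable_epel.yml:32, 'command' action
  command: "true"                        # placeholder body
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version in ['7', '8']

- name: Enable EPEL 8                    # enable_epel.yml:37, 'command' action
  command: "true"                        # placeholder body
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version in ['7', '8']

- name: Enable EPEL 6                    # enable_epel.yml:42, 'copy' action
  copy:
    dest: /etc/yum.repos.d/epel.repo     # hypothetical destination
    content: ""                          # placeholder body
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version == '6'
```

On this node every conditional on the distribution major version is False (the fact is 9), so all four tasks skip without dispatching a module to the remote host.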
51385 1727204585.36491: done checking for max_fail_percentage 51385 1727204585.36492: checking to see if all hosts have failed and the running result is not ok 51385 1727204585.36493: done checking to see if all hosts have failed 51385 1727204585.36493: getting the remaining hosts for this loop 51385 1727204585.36495: done getting the remaining hosts for this loop 51385 1727204585.36498: getting the next task for host managed-node1 51385 1727204585.36506: done getting next task for host managed-node1 51385 1727204585.36509: ^ task is: TASK: Set network provider to 'nm' 51385 1727204585.36511: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 51385 1727204585.36514: getting variables 51385 1727204585.36516: in VariableManager get_vars() 51385 1727204585.36540: Calling all_inventory to load vars for managed-node1 51385 1727204585.36543: Calling groups_inventory to load vars for managed-node1 51385 1727204585.36546: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204585.36558: Calling all_plugins_play to load vars for managed-node1 51385 1727204585.36560: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204585.36576: Calling groups_plugins_play to load vars for managed-node1 51385 1727204585.36937: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204585.37161: done with get_vars() 51385 1727204585.37173: done getting variables 51385 1727204585.37306: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=True)

TASK [Set network provider to 'nm'] ********************************************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tests_vlan_mtu_nm.yml:13
Tuesday 24 September 2024  15:03:05 -0400 (0:00:00.025)       0:00:03.777 *****
51385 1727204585.37381: entering _queue_task() for managed-node1/set_fact
51385 1727204585.37685: worker is 1 (out of 1 available)
51385 1727204585.37698: exiting _queue_task() for managed-node1/set_fact
51385 1727204585.37708: done queuing things up, now waiting for results queue to drain
51385 1727204585.37710: waiting for pending results...
51385 1727204585.37995: running TaskExecutor() for managed-node1/TASK: Set network provider to 'nm'
51385 1727204585.38120: in run() - task 0affcd87-79f5-6b1f-5706-000000000007
51385 1727204585.38140: variable 'ansible_search_path' from source: unknown
51385 1727204585.38189: calling self._execute()
51385 1727204585.38281: variable 'ansible_host' from source: host vars for 'managed-node1'
51385 1727204585.38292: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
51385 1727204585.38305: variable 'omit' from source: magic vars
51385 1727204585.38438: variable 'omit' from source: magic vars
51385 1727204585.38493: variable 'omit' from source: magic vars
51385 1727204585.38536: variable 'omit' from source: magic vars
51385 1727204585.38607: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
51385 1727204585.38678: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
51385 1727204585.38709: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
51385 1727204585.38731: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
51385 1727204585.38748: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
51385 1727204585.38853: variable 'inventory_hostname' from source: host vars for 'managed-node1'
51385 1727204585.38865: variable 'ansible_host' from source: host vars for 'managed-node1'
51385 1727204585.38873: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
51385 1727204585.39083: Set connection var ansible_pipelining to False
51385 1727204585.39091: Set connection var ansible_shell_type to sh
51385 1727204585.39106: Set connection var ansible_module_compression to ZIP_DEFLATED
51385 1727204585.39129: Set connection var ansible_timeout to 10
51385 1727204585.39152: Set connection var ansible_connection to ssh
51385 1727204585.39168: Set connection var ansible_shell_executable to /bin/sh
51385 1727204585.39193: variable 'ansible_shell_executable' from source: unknown
51385 1727204585.39200: variable 'ansible_connection' from source: unknown
51385 1727204585.39206: variable 'ansible_module_compression' from source: unknown
51385 1727204585.39212: variable 'ansible_shell_type' from source: unknown
51385 1727204585.39217: variable 'ansible_shell_executable' from source: unknown
51385 1727204585.39233: variable 'ansible_host' from source: host vars for 'managed-node1'
51385 1727204585.39244: variable 'ansible_pipelining' from source: unknown
51385 1727204585.39251: variable 'ansible_timeout' from source: unknown
51385 1727204585.39258: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
51385 1727204585.39420: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
51385 1727204585.39491: variable 'omit' from source: magic vars
51385 1727204585.39501: starting attempt loop
51385 1727204585.39507: running the handler
51385 1727204585.39521: handler run complete
51385 1727204585.39533: attempt loop complete, returning result
51385 1727204585.39539: _execute() done
51385 1727204585.39544: dumping result to json
51385 1727204585.39565: done dumping result, returning
51385 1727204585.39586: done running TaskExecutor() for managed-node1/TASK: Set network provider to 'nm' [0affcd87-79f5-6b1f-5706-000000000007]
51385 1727204585.39596: sending task result for task 0affcd87-79f5-6b1f-5706-000000000007
ok: [managed-node1] => {
    "ansible_facts": {
        "network_provider": "nm"
    },
    "changed": false
}
51385 1727204585.39750: no more pending results, returning what we have
51385 1727204585.39753: results queue empty
51385 1727204585.39754: checking for any_errors_fatal
51385 1727204585.39758: done checking for any_errors_fatal
51385 1727204585.39759: checking for max_fail_percentage
51385 1727204585.39765: done checking for max_fail_percentage
51385 1727204585.39766: checking to see if all hosts have failed and the running result is not ok
51385 1727204585.39767: done checking to see if all hosts have failed
51385 1727204585.39768: getting the remaining hosts for this loop
51385 1727204585.39769: done getting the remaining hosts for this loop
51385 1727204585.39773: getting the next task for host managed-node1
51385 1727204585.39781: done getting next task for host managed-node1
51385 1727204585.39783: ^ task is: TASK: meta (flush_handlers)
51385 1727204585.39785: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False
51385 1727204585.39789: getting variables
51385 1727204585.39791: in VariableManager get_vars()
51385 1727204585.39822: Calling all_inventory to load vars for managed-node1
51385 1727204585.39824: Calling groups_inventory to load vars for managed-node1
51385 1727204585.39828: Calling all_plugins_inventory to load vars for managed-node1
51385 1727204585.39840: Calling all_plugins_play to load vars for managed-node1
51385 1727204585.39842: Calling groups_plugins_inventory to load vars for managed-node1
51385 1727204585.39845: Calling groups_plugins_play to load vars for managed-node1
51385 1727204585.40052: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
51385 1727204585.40287: done with get_vars()
51385 1727204585.40306: done getting variables
51385 1727204585.40383: in VariableManager get_vars()
51385 1727204585.40392: Calling all_inventory to load vars for managed-node1
51385 1727204585.40395: Calling groups_inventory to load vars for managed-node1
51385 1727204585.40397: Calling all_plugins_inventory to load vars for managed-node1
51385 1727204585.40520: Calling all_plugins_play to load vars for managed-node1
51385 1727204585.40524: Calling groups_plugins_inventory to load vars for managed-node1
51385 1727204585.40530: done sending task result for task 0affcd87-79f5-6b1f-5706-000000000007
51385 1727204585.40534: WORKER PROCESS EXITING
51385 1727204585.40537: Calling groups_plugins_play to load vars for managed-node1
51385 1727204585.40988: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
51385 1727204585.41708: done with get_vars()
51385 1727204585.41888: done queuing things up, now waiting for results queue to drain
51385 1727204585.41891: results queue empty
51385 1727204585.41891: checking for any_errors_fatal
51385 1727204585.41895: done checking for any_errors_fatal
51385 1727204585.41895: checking for max_fail_percentage
51385 1727204585.41896: done checking for max_fail_percentage
51385 1727204585.41897: checking to see if all hosts have failed and the running result is not ok
51385 1727204585.41898: done checking to see if all hosts have failed
51385 1727204585.41899: getting the remaining hosts for this loop
51385 1727204585.41900: done getting the remaining hosts for this loop
51385 1727204585.41903: getting the next task for host managed-node1
51385 1727204585.41907: done getting next task for host managed-node1
51385 1727204585.41908: ^ task is: TASK: meta (flush_handlers)
51385 1727204585.41910: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
51385 1727204585.41985: getting variables
51385 1727204585.41987: in VariableManager get_vars()
51385 1727204585.42007: Calling all_inventory to load vars for managed-node1
51385 1727204585.42010: Calling groups_inventory to load vars for managed-node1
51385 1727204585.42012: Calling all_plugins_inventory to load vars for managed-node1
51385 1727204585.42041: Calling all_plugins_play to load vars for managed-node1
51385 1727204585.42049: Calling groups_plugins_inventory to load vars for managed-node1
51385 1727204585.42053: Calling groups_plugins_play to load vars for managed-node1
51385 1727204585.42254: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
51385 1727204585.42631: done with get_vars()
51385 1727204585.42670: done getting variables
51385 1727204585.42980: in VariableManager get_vars()
51385 1727204585.43004: Calling all_inventory to load vars for managed-node1
51385 1727204585.43006: Calling groups_inventory to load vars for managed-node1
51385 1727204585.43009: Calling all_plugins_inventory to load vars for managed-node1
51385 1727204585.43037: Calling all_plugins_play to load vars for managed-node1
51385 1727204585.43040: Calling groups_plugins_inventory to load vars for managed-node1
51385 1727204585.43062: Calling groups_plugins_play to load vars for managed-node1
51385 1727204585.43484: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
51385 1727204585.43772: done with get_vars()
51385 1727204585.43784: done queuing things up, now waiting for results queue to drain
51385 1727204585.43786: results queue empty
51385 1727204585.43787: checking for any_errors_fatal
51385 1727204585.43793: done checking for any_errors_fatal
51385 1727204585.43804: checking for max_fail_percentage
51385 1727204585.43806: done checking for max_fail_percentage
51385 1727204585.43807: checking to see if all hosts have failed and the running result is not ok
51385 1727204585.43807: done checking to see if all hosts have failed
51385 1727204585.43808: getting the remaining hosts for this loop
51385 1727204585.43809: done getting the remaining hosts for this loop
51385 1727204585.43812: getting the next task for host managed-node1
51385 1727204585.43815: done getting next task for host managed-node1
51385 1727204585.43816: ^ task is: None
51385 1727204585.43818: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
51385 1727204585.43828: done queuing things up, now waiting for results queue to drain
51385 1727204585.43829: results queue empty
51385 1727204585.43830: checking for any_errors_fatal
51385 1727204585.43831: done checking for any_errors_fatal
51385 1727204585.43832: checking for max_fail_percentage
51385 1727204585.43833: done checking for max_fail_percentage
51385 1727204585.43833: checking to see if all hosts have failed and the running result is not ok
51385 1727204585.43834: done checking to see if all hosts have failed
51385 1727204585.43852: getting the next task for host managed-node1
51385 1727204585.43857: done getting next task for host managed-node1
51385 1727204585.43858: ^ task is: None
51385 1727204585.43859: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False
51385 1727204585.43955: in VariableManager get_vars()
51385 1727204585.44028: done with get_vars()
51385 1727204585.44040: in VariableManager get_vars()
51385 1727204585.44089: done with get_vars()
51385 1727204585.44107: variable 'omit' from source: magic vars
51385 1727204585.44176: in VariableManager get_vars()
51385 1727204585.44205: done with get_vars()
51385 1727204585.44271: variable 'omit' from source: magic vars

PLAY [Play for testing vlan mtu setting] ***************************************

51385 1727204585.44924: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False)
51385 1727204585.44970: getting the remaining hosts for this loop
51385 1727204585.44971: done getting the remaining hosts for this loop
51385 1727204585.44975: getting the next task for host managed-node1
51385 1727204585.44978: done getting next task for host managed-node1
51385 1727204585.44980: ^ task is: TASK: Gathering Facts
51385 1727204585.44982: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
51385 1727204585.44984: getting variables
51385 1727204585.44985: in VariableManager get_vars()
51385 1727204585.45008: Calling all_inventory to load vars for managed-node1
51385 1727204585.45011: Calling groups_inventory to load vars for managed-node1
51385 1727204585.45013: Calling all_plugins_inventory to load vars for managed-node1
51385 1727204585.45018: Calling all_plugins_play to load vars for managed-node1
51385 1727204585.45043: Calling groups_plugins_inventory to load vars for managed-node1
51385 1727204585.45049: Calling groups_plugins_play to load vars for managed-node1
51385 1727204585.45224: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
51385 1727204585.45501: done with get_vars()
51385 1727204585.45510: done getting variables
51385 1727204585.45553: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_vlan_mtu.yml:3
Tuesday 24 September 2024  15:03:05 -0400 (0:00:00.082)       0:00:03.859 *****
51385 1727204585.45585: entering _queue_task() for managed-node1/gather_facts
51385 1727204585.45888: worker is 1 (out of 1 available)
51385 1727204585.45905: exiting _queue_task() for managed-node1/gather_facts
51385 1727204585.45918: done queuing things up, now waiting for results queue to drain
51385 1727204585.45920: waiting for pending results...
51385 1727204585.46183: running TaskExecutor() for managed-node1/TASK: Gathering Facts
51385 1727204585.46291: in run() - task 0affcd87-79f5-6b1f-5706-00000000010b
51385 1727204585.46311: variable 'ansible_search_path' from source: unknown
51385 1727204585.46359: calling self._execute()
51385 1727204585.46457: variable 'ansible_host' from source: host vars for 'managed-node1'
51385 1727204585.46476: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
51385 1727204585.46490: variable 'omit' from source: magic vars
51385 1727204585.46795: variable 'ansible_distribution_major_version' from source: facts
51385 1727204585.46804: Evaluated conditional (ansible_distribution_major_version != '6'): True
51385 1727204585.46810: variable 'omit' from source: magic vars
51385 1727204585.46828: variable 'omit' from source: magic vars
51385 1727204585.46853: variable 'omit' from source: magic vars
51385 1727204585.46894: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
51385 1727204585.46918: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
51385 1727204585.46934: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
51385 1727204585.46948: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
51385 1727204585.46957: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
51385 1727204585.46985: variable 'inventory_hostname' from source: host vars for 'managed-node1'
51385 1727204585.46988: variable 'ansible_host' from source: host vars for 'managed-node1'
51385 1727204585.46991: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
51385 1727204585.47063: Set connection var ansible_pipelining to False
51385 1727204585.47068: Set connection var ansible_shell_type to sh
51385 1727204585.47074: Set connection var ansible_module_compression to ZIP_DEFLATED
51385 1727204585.47081: Set connection var ansible_timeout to 10
51385 1727204585.47083: Set connection var ansible_connection to ssh
51385 1727204585.47088: Set connection var ansible_shell_executable to /bin/sh
51385 1727204585.47104: variable 'ansible_shell_executable' from source: unknown
51385 1727204585.47112: variable 'ansible_connection' from source: unknown
51385 1727204585.47115: variable 'ansible_module_compression' from source: unknown
51385 1727204585.47118: variable 'ansible_shell_type' from source: unknown
51385 1727204585.47122: variable 'ansible_shell_executable' from source: unknown
51385 1727204585.47124: variable 'ansible_host' from source: host vars for 'managed-node1'
51385 1727204585.47126: variable 'ansible_pipelining' from source: unknown
51385 1727204585.47129: variable 'ansible_timeout' from source: unknown
51385 1727204585.47131: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
51385 1727204585.47336: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
51385 1727204585.47343: variable 'omit' from source: magic vars
51385 1727204585.47350: starting attempt loop
51385 1727204585.47353: running the handler
51385 1727204585.47367: variable 'ansible_facts' from source: unknown
51385 1727204585.47383: _low_level_execute_command(): starting
51385 1727204585.47389: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
51385 1727204585.48152: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
51385 1727204585.48178: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
51385 1727204585.48181: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
51385 1727204585.48184: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
51385 1727204585.48229: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<<
51385 1727204585.48234: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
51385 1727204585.48248: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
51385 1727204585.48306: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
51385 1727204585.48320: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<<
51385 1727204585.48331: stderr chunk (state=3): >>>debug2: match found <<<
51385 1727204585.48352: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
51385 1727204585.48437: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
51385 1727204585.48466: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
51385 1727204585.48488: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
51385 1727204585.48710: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
51385 1727204585.51098: stdout chunk (state=3): >>>/root <<<
51385 1727204585.51363: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
51385 1727204585.51642: stderr chunk (state=3): >>><<<
51385 1727204585.51686: stdout chunk (state=3): >>><<<
51385 1727204585.51807: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
51385 1727204585.51822: _low_level_execute_command(): starting
51385 1727204585.51825: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204585.517247-51674-39707178358803 `" && echo ansible-tmp-1727204585.517247-51674-39707178358803="` echo /root/.ansible/tmp/ansible-tmp-1727204585.517247-51674-39707178358803 `" ) && sleep 0'
51385 1727204585.52658: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
51385 1727204585.52709: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
51385 1727204585.52783: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
51385 1727204585.52848: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
51385 1727204585.53050: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<<
51385 1727204585.53067: stderr chunk (state=3): >>>debug2: match not found <<<
51385 1727204585.53081: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
51385 1727204585.53103: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
51385 1727204585.53114: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<<
51385 1727204585.53126: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
51385 1727204585.53140: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
51385 1727204585.53156: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
51385 1727204585.53178: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
51385 1727204585.53191: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<<
51385 1727204585.53230: stderr chunk (state=3): >>>debug2: match found <<<
51385 1727204585.53250: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
51385 1727204585.53339: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
51385 1727204585.53374: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
51385 1727204585.53448: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
51385 1727204585.53574: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
51385 1727204585.56178: stdout chunk (state=3): >>>ansible-tmp-1727204585.517247-51674-39707178358803=/root/.ansible/tmp/ansible-tmp-1727204585.517247-51674-39707178358803 <<<
51385 1727204585.56340: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
51385 1727204585.56440: stderr chunk (state=3): >>><<<
51385 1727204585.56450: stdout chunk (state=3): >>><<<
51385 1727204585.56516: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204585.517247-51674-39707178358803=/root/.ansible/tmp/ansible-tmp-1727204585.517247-51674-39707178358803 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
51385 1727204585.56600: variable 'ansible_module_compression' from source: unknown
51385 1727204585.56791: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-51385tpxlmlox/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED
51385 1727204585.56794: variable 'ansible_facts' from source: unknown
51385
1727204585.57310: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204585.517247-51674-39707178358803/AnsiballZ_setup.py
51385 1727204585.57643: Sending initial data
51385 1727204585.57646: Sent initial data (152 bytes)
51385 1727204585.58651: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
51385 1727204585.58667: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
51385 1727204585.58682: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
51385 1727204585.58698: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
51385 1727204585.58741: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<<
51385 1727204585.58758: stderr chunk (state=3): >>>debug2: match not found <<<
51385 1727204585.58778: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
51385 1727204585.58795: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
51385 1727204585.58807: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<<
51385 1727204585.58818: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
51385 1727204585.58831: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
51385 1727204585.58847: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
51385 1727204585.58873: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
51385 1727204585.58887: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<<
51385 1727204585.58898: stderr chunk (state=3): >>>debug2: match found <<<
51385 1727204585.58910: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
51385 1727204585.58991: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
51385 1727204585.59007: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
51385 1727204585.59021: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
51385 1727204585.59159: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
51385 1727204585.61570: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<<
51385 1727204585.61599: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<<
51385 1727204585.61633: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-51385tpxlmlox/tmpqbag2zft /root/.ansible/tmp/ansible-tmp-1727204585.517247-51674-39707178358803/AnsiballZ_setup.py <<<
51385 1727204585.61698: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<<
51385 1727204585.64914: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
51385 1727204585.65077: stderr chunk (state=3): >>><<<
51385 1727204585.65081: stdout chunk (state=3): >>><<<
51385 1727204585.65083: done transferring module to remote
51385 1727204585.65085: _low_level_execute_command(): starting
51385 1727204585.65087: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204585.517247-51674-39707178358803/ /root/.ansible/tmp/ansible-tmp-1727204585.517247-51674-39707178358803/AnsiballZ_setup.py && sleep 0'
51385 1727204585.65830: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<<
51385 1727204585.65834: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
51385 1727204585.65866: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<<
51385 1727204585.65871: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
51385 1727204585.65883: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
51385 1727204585.65956: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
51385 1727204585.65963: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
51385 1727204585.65971: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
51385 1727204585.66040: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
51385 1727204585.68509: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
51385 1727204585.68590: stderr chunk (state=3): >>><<<
51385 1727204585.68594: stdout chunk (state=3): >>><<<
51385 1727204585.68677: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
51385 1727204585.68681: _low_level_execute_command(): starting
51385 1727204585.68684: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204585.517247-51674-39707178358803/AnsiballZ_setup.py && sleep 0'
51385 1727204585.70009: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
51385 1727204585.70028: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
51385 1727204585.70045: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
51385 1727204585.70067: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
51385 1727204585.70118: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148
originally 10.31.9.148 <<<
51385 1727204585.70130: stderr chunk (state=3): >>>debug2: match not found <<<
51385 1727204585.70143: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
51385 1727204585.70159: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
51385 1727204585.70178: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<<
51385 1727204585.70192: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
51385 1727204585.70210: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
51385 1727204585.70223: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
51385 1727204585.70238: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
51385 1727204585.70249: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<<
51385 1727204585.70258: stderr chunk (state=3): >>>debug2: match found <<<
51385 1727204585.70277: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
51385 1727204585.70363: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
51385 1727204585.70389: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
51385 1727204585.70404: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
51385 1727204585.70509: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
51385 1727204586.32491: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_is_chroot": false, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "03", "second": "06", "epoch": "1727204586", "epoch_int": "1727204586", "date": "2024-09-24", "time": "15:03:06", "iso8601_micro": "2024-09-24T19:03:06.028584Z", "iso8601": "2024-09-24T19:03:06Z", "iso8601_basic": "20240924T150306028584", "iso8601_basic_short": "20240924T150306", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node1", "ansible_hostname": "managed-node1", "ansible_nodename": "managed-node1", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "622812727ab94fd6acd7dd0d437b6e90", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_loadavg": {"1m": 0.43, "5m": 0.43, "15m": 0.29}, "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root",<<<
51385 1727204586.32515: stdout chunk (state=3): >>> "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 42862 10.31.9.148 22",
"XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 42862 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAL33r0sK53nK1ELEWEygFfBly+jKL3G1irB+e4OjfP+034giVSb4+qmZbsccYzFRUysDiwQ9AOkXxjXzuDmR+xqyvjg1wiGR1mtnhVEjD5QOMP3FrsA4T0YUj+99RePF5V1syZcivhL83fhGMQW2xqX2DsatToaaogZ2OB3PfjtjAAAAFQDxVD/D0jmbOX2y1lkpNUepQHopQwAAAIEAmIlCnXBlcPjUWk7TsM1FtBhlH1jHHCOTF1EkipNNB3yizhCo4XzHdZ42Etc3A12/rcZ94rFaauqCV6XrShBkQ2YBIcz9u8BOyWI/nScoq9IA/qLrhWWtjBWDyrdKnEa5YZssQtDa+FaZQkzy1TQpvFQxv5c95+TrmPFgDpw+0q0AAACBAKYOTFtEPTGzq9w6YdMspRWk65ZgXou58bQl818PvNnuZKKVReDFknfNCcabfj+HjlOg9wBCZZ+D3vopxZ4Qgevz/pLqcnLY7Kxx+xf6NhqDwcEwkHk/VYomBLrfyEZP8N81dcv36ZZUVoca5Y+2ZG2o1gC632nLGosyJBtmPmel", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCzhflzK5hY1zLI6wcdpu95QnCG0TwdK/8SyDudYYHDoRycyI9pVuSUQsXpQq3jHdjCurFgtKtyr8lvo1dWvKJ9SZpQk4asejncDNODyhSR95eNQg6E1G2kN1mscOp76cjW9Muvyhcku112WdRWTEojLJL5DfJAWrsWwHJI+QuhZuKvrlsxPvfOuY5td/aGC/Ydzbjkmya2qvXXJRscQArDnYmiPiatkFESRif9MXdmIn2LqQXAcZGFUG+SWQvZR1PDWKI2U5HxvoeUf+Uh2jDO3mFDWao9+SGRC2QuO+xLJgoiKIx2L3GWLTkbKjAbsk0iedaUuh+GdmUatsU09UVZi9IYBJYjhiYuZKsYx2LNpBqu8fxh5RaBfadQzkDGVBJE45/9X+9vlSygk3zMak9yWtS9vfV+CoODJx9wA1tv3r0Veiy/Y9bbcT7DtQhiWscP2X/cF2QZtdabW+Rb+zKZomn+6upN+zZeyVRClRsqVNURxevMs+UyJTKV481ayMU=", "ansible_ssh_host_key_rsa_public_keytype": 
"ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBHpEZiCiKJZKK5PvXzPGl0kyJcU4P7nxoUjBffLcHt9dAB0RhjGORZ4v3/W6TdO0PAsLaKZ7WyFecLN3V9VWyiA=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIK5vZWfq5/76ny3vCPOJqG/mpsIiiNwZzQWhA7bM1PFT", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_lsb": {}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5aef1ea-3141-48ae-bf33-0c6b351dd422", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_fibre_channel_wwn": [], "ansible_iscsi_iqn": "", "ansible_apparmor": {"status": "disabled"}, "ansible_fips": false, "ansible_local": {}, "ansible_service_mgr": "systemd", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_pkg_mgr": "dnf", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineI<<< 51385 1727204586.32553: stdout chunk (state=3): >>>ntel", "Intel(R) Xeon(R) CPU 
E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2755, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 777, "free": 2755}, "nocache": {"free": 3232, "used": 300}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec28c5e6-50d6-5684-e735-f75357a23b08", "ansible_product_uuid": "ec28c5e6-50d6-5684-e735-f75357a23b08", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 849, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", 
"device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264266956800, "block_size": 4096, "block_total": 65519355, "block_available": 64518300, "block_used": 1001055, "inode_total": 131071472, "inode_available": 130998224, "inode_used": 73248, "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae"}], "ansible_interfaces": ["rpltstbr", "eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "12:8f:92:e7:c1:ab", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.148", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::108f:92ff:fee7:c1ab", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", 
"tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_<<< 51385 1727204586.32559: stdout chunk (state=3): >>>tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_rpltstbr": {"device": "rpltstbr", "macaddress": "4a:d1:a2:43:cd:1d", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "ipv4": {"address": "192.0.2.72", "broadcast": "", "netmask": "255.255.255.254", "network": "192.0.2.72", "prefix": "31"}, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", 
"tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": 
"128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off 
[fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.148", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:8f:92:e7:c1:ab", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.148", "192.0.2.72"], "ansible_all_ipv6_addresses": ["fe80::108f:92ff:fee7:c1ab"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.148", "127.0.0.0/8", "127.0.0.1", "192.0.2.72"], "ipv6": ["::1", "fe80::108f:92ff:fee7:c1ab"]}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 51385 1727204586.35093: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
<<< 51385 1727204586.35100: stdout chunk (state=3): >>><<< 51385 1727204586.35102: stderr chunk (state=3): >>><<< 51385 1727204586.35280: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_is_chroot": false, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "03", "second": "06", "epoch": "1727204586", "epoch_int": "1727204586", "date": "2024-09-24", "time": "15:03:06", "iso8601_micro": "2024-09-24T19:03:06.028584Z", "iso8601": "2024-09-24T19:03:06Z", "iso8601_basic": "20240924T150306028584", "iso8601_basic_short": "20240924T150306", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node1", "ansible_hostname": "managed-node1", "ansible_nodename": "managed-node1", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "622812727ab94fd6acd7dd0d437b6e90", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", 
"ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_loadavg": {"1m": 0.43, "5m": 0.43, "15m": 0.29}, "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 42862 10.31.9.148 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 42862 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAL33r0sK53nK1ELEWEygFfBly+jKL3G1irB+e4OjfP+034giVSb4+qmZbsccYzFRUysDiwQ9AOkXxjXzuDmR+xqyvjg1wiGR1mtnhVEjD5QOMP3FrsA4T0YUj+99RePF5V1syZcivhL83fhGMQW2xqX2DsatToaaogZ2OB3PfjtjAAAAFQDxVD/D0jmbOX2y1lkpNUepQHopQwAAAIEAmIlCnXBlcPjUWk7TsM1FtBhlH1jHHCOTF1EkipNNB3yizhCo4XzHdZ42Etc3A12/rcZ94rFaauqCV6XrShBkQ2YBIcz9u8BOyWI/nScoq9IA/qLrhWWtjBWDyrdKnEa5YZssQtDa+FaZQkzy1TQpvFQxv5c95+TrmPFgDpw+0q0AAACBAKYOTFtEPTGzq9w6YdMspRWk65ZgXou58bQl818PvNnuZKKVReDFknfNCcabfj+HjlOg9wBCZZ+D3vopxZ4Qgevz/pLqcnLY7Kxx+xf6NhqDwcEwkHk/VYomBLrfyEZP8N81dcv36ZZUVoca5Y+2ZG2o1gC632nLGosyJBtmPmel", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCzhflzK5hY1zLI6wcdpu95QnCG0TwdK/8SyDudYYHDoRycyI9pVuSUQsXpQq3jHdjCurFgtKtyr8lvo1dWvKJ9SZpQk4asejncDNODyhSR95eNQg6E1G2kN1mscOp76cjW9Muvyhcku112WdRWTEojLJL5DfJAWrsWwHJI+QuhZuKvrlsxPvfOuY5td/aGC/Ydzbjkmya2qvXXJRscQArDnYmiPiatkFESRif9MXdmIn2LqQXAcZGFUG+SWQvZR1PDWKI2U5HxvoeUf+Uh2jDO3mFDWao9+SGRC2QuO+xLJgoiKIx2L3GWLTkbKjAbsk0iedaUuh+GdmUatsU09UVZi9IYBJYjhiYuZKsYx2LNpBqu8fxh5RaBfadQzkDGVBJE45/9X+9vlSygk3zMak9yWtS9vfV+CoODJx9wA1tv3r0Veiy/Y9bbcT7DtQhiWscP2X/cF2QZtdabW+Rb+zKZomn+6upN+zZeyVRClRsqVNURxevMs+UyJTKV481ayMU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBHpEZiCiKJZKK5PvXzPGl0kyJcU4P7nxoUjBffLcHt9dAB0RhjGORZ4v3/W6TdO0PAsLaKZ7WyFecLN3V9VWyiA=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIK5vZWfq5/76ny3vCPOJqG/mpsIiiNwZzQWhA7bM1PFT", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_lsb": {}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5aef1ea-3141-48ae-bf33-0c6b351dd422", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", 
"10.29.170.12", "10.2.32.1"]}, "ansible_fibre_channel_wwn": [], "ansible_iscsi_iqn": "", "ansible_apparmor": {"status": "disabled"}, "ansible_fips": false, "ansible_local": {}, "ansible_service_mgr": "systemd", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_pkg_mgr": "dnf", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2755, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 777, "free": 2755}, "nocache": {"free": 3232, "used": 300}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec28c5e6-50d6-5684-e735-f75357a23b08", "ansible_product_uuid": "ec28c5e6-50d6-5684-e735-f75357a23b08", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"], 
"labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 849, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264266956800, "block_size": 4096, "block_total": 65519355, "block_available": 64518300, "block_used": 1001055, "inode_total": 131071472, "inode_available": 130998224, "inode_used": 73248, "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae"}], "ansible_interfaces": ["rpltstbr", "eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "12:8f:92:e7:c1:ab", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.148", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::108f:92ff:fee7:c1ab", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", 
"large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_rpltstbr": {"device": "rpltstbr", "macaddress": "4a:d1:a2:43:cd:1d", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "ipv4": {"address": "192.0.2.72", "broadcast": "", 
"netmask": "255.255.255.254", "network": "192.0.2.72", "prefix": "31"}, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", 
"rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off 
[fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.148", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:8f:92:e7:c1:ab", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.148", "192.0.2.72"], "ansible_all_ipv6_addresses": ["fe80::108f:92ff:fee7:c1ab"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.148", "127.0.0.0/8", "127.0.0.1", "192.0.2.72"], "ipv6": ["::1", "fe80::108f:92ff:fee7:c1ab"]}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 51385 1727204586.35599: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204585.517247-51674-39707178358803/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 51385 1727204586.35632: _low_level_execute_command(): starting 51385 1727204586.35644: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204585.517247-51674-39707178358803/ > /dev/null 2>&1 && sleep 0' 51385 1727204586.37476: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204586.37485: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204586.37495: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204586.37509: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204586.37556: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204586.37568: stderr chunk (state=3): >>>debug2: match not found <<< 51385 1727204586.37575: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204586.37589: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 51385 1727204586.37644: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 51385 1727204586.37651: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 51385 1727204586.37659: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204586.37675: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204586.37689: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204586.37700: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204586.37709: stderr chunk (state=3): >>>debug2: match found <<< 51385 1727204586.37721: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204586.37803: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204586.37984: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204586.37999: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204586.38093: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 3 <<< 51385 1727204586.40661: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 <<< 51385 1727204586.40717: stderr chunk (state=3): >>><<< 51385 1727204586.40720: stdout chunk (state=3): >>><<< 51385 1727204586.40973: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 3 debug2: Received exit status from master 0 51385 1727204586.40976: handler run complete 51385 1727204586.40978: variable 'ansible_facts' from source: unknown 51385 1727204586.40980: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204586.42030: variable 'ansible_facts' from source: unknown 51385 1727204586.42286: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204586.42708: attempt loop complete, returning result 51385 1727204586.42720: _execute() done 51385 1727204586.42896: dumping result to json 51385 1727204586.42936: done 
dumping result, returning 51385 1727204586.42950: done running TaskExecutor() for managed-node1/TASK: Gathering Facts [0affcd87-79f5-6b1f-5706-00000000010b] 51385 1727204586.43007: sending task result for task 0affcd87-79f5-6b1f-5706-00000000010b ok: [managed-node1] 51385 1727204586.44149: no more pending results, returning what we have 51385 1727204586.44152: results queue empty 51385 1727204586.44153: checking for any_errors_fatal 51385 1727204586.44154: done checking for any_errors_fatal 51385 1727204586.44155: checking for max_fail_percentage 51385 1727204586.44156: done checking for max_fail_percentage 51385 1727204586.44157: checking to see if all hosts have failed and the running result is not ok 51385 1727204586.44158: done checking to see if all hosts have failed 51385 1727204586.44159: getting the remaining hosts for this loop 51385 1727204586.44161: done getting the remaining hosts for this loop 51385 1727204586.44166: getting the next task for host managed-node1 51385 1727204586.44173: done getting next task for host managed-node1 51385 1727204586.44175: ^ task is: TASK: meta (flush_handlers) 51385 1727204586.44177: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51385 1727204586.44180: getting variables 51385 1727204586.44182: in VariableManager get_vars() 51385 1727204586.44217: Calling all_inventory to load vars for managed-node1 51385 1727204586.44220: Calling groups_inventory to load vars for managed-node1 51385 1727204586.44223: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204586.44234: Calling all_plugins_play to load vars for managed-node1 51385 1727204586.44236: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204586.44239: Calling groups_plugins_play to load vars for managed-node1 51385 1727204586.44399: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204586.45086: done sending task result for task 0affcd87-79f5-6b1f-5706-00000000010b 51385 1727204586.45090: WORKER PROCESS EXITING 51385 1727204586.45135: done with get_vars() 51385 1727204586.45147: done getting variables 51385 1727204586.45218: in VariableManager get_vars() 51385 1727204586.45233: Calling all_inventory to load vars for managed-node1 51385 1727204586.45235: Calling groups_inventory to load vars for managed-node1 51385 1727204586.45237: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204586.45243: Calling all_plugins_play to load vars for managed-node1 51385 1727204586.45245: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204586.45252: Calling groups_plugins_play to load vars for managed-node1 51385 1727204586.45585: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204586.46219: done with get_vars() 51385 1727204586.46233: done queuing things up, now waiting for results queue to drain 51385 1727204586.46234: results queue empty 51385 1727204586.46235: checking for any_errors_fatal 51385 1727204586.46238: done checking for any_errors_fatal 51385 1727204586.46239: checking for max_fail_percentage 51385 
1727204586.46240: done checking for max_fail_percentage 51385 1727204586.46241: checking to see if all hosts have failed and the running result is not ok 51385 1727204586.46242: done checking to see if all hosts have failed 51385 1727204586.46242: getting the remaining hosts for this loop 51385 1727204586.46243: done getting the remaining hosts for this loop 51385 1727204586.46245: getting the next task for host managed-node1 51385 1727204586.46249: done getting next task for host managed-node1 51385 1727204586.46251: ^ task is: TASK: Include the task 'show_interfaces.yml' 51385 1727204586.46252: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 51385 1727204586.46254: getting variables 51385 1727204586.46255: in VariableManager get_vars() 51385 1727204586.46270: Calling all_inventory to load vars for managed-node1 51385 1727204586.46273: Calling groups_inventory to load vars for managed-node1 51385 1727204586.46274: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204586.46279: Calling all_plugins_play to load vars for managed-node1 51385 1727204586.46281: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204586.46283: Calling groups_plugins_play to load vars for managed-node1 51385 1727204586.46420: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204586.47409: done with get_vars() 51385 1727204586.47418: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_vlan_mtu.yml:10 Tuesday 24 September 2024 15:03:06 -0400 (0:00:01.019) 0:00:04.878 ***** 51385 
1727204586.47495: entering _queue_task() for managed-node1/include_tasks 51385 1727204586.47798: worker is 1 (out of 1 available) 51385 1727204586.47810: exiting _queue_task() for managed-node1/include_tasks 51385 1727204586.47822: done queuing things up, now waiting for results queue to drain 51385 1727204586.47823: waiting for pending results... 51385 1727204586.48879: running TaskExecutor() for managed-node1/TASK: Include the task 'show_interfaces.yml' 51385 1727204586.49176: in run() - task 0affcd87-79f5-6b1f-5706-00000000000b 51385 1727204586.49341: variable 'ansible_search_path' from source: unknown 51385 1727204586.49390: calling self._execute() 51385 1727204586.49763: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204586.49802: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204586.49815: variable 'omit' from source: magic vars 51385 1727204586.51023: variable 'ansible_distribution_major_version' from source: facts 51385 1727204586.51141: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204586.51153: _execute() done 51385 1727204586.51166: dumping result to json 51385 1727204586.51177: done dumping result, returning 51385 1727204586.51276: done running TaskExecutor() for managed-node1/TASK: Include the task 'show_interfaces.yml' [0affcd87-79f5-6b1f-5706-00000000000b] 51385 1727204586.51302: sending task result for task 0affcd87-79f5-6b1f-5706-00000000000b 51385 1727204586.51545: no more pending results, returning what we have 51385 1727204586.51551: in VariableManager get_vars() 51385 1727204586.51608: Calling all_inventory to load vars for managed-node1 51385 1727204586.51611: Calling groups_inventory to load vars for managed-node1 51385 1727204586.51614: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204586.51628: Calling all_plugins_play to load vars for managed-node1 51385 1727204586.51630: Calling groups_plugins_inventory to 
load vars for managed-node1 51385 1727204586.51633: Calling groups_plugins_play to load vars for managed-node1 51385 1727204586.51829: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204586.52087: done with get_vars() 51385 1727204586.52147: variable 'ansible_search_path' from source: unknown 51385 1727204586.52169: we have included files to process 51385 1727204586.52170: generating all_blocks data 51385 1727204586.52172: done generating all_blocks data 51385 1727204586.52173: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 51385 1727204586.52174: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 51385 1727204586.52177: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 51385 1727204586.52776: done sending task result for task 0affcd87-79f5-6b1f-5706-00000000000b 51385 1727204586.52779: WORKER PROCESS EXITING 51385 1727204586.52785: in VariableManager get_vars() 51385 1727204586.52812: done with get_vars() 51385 1727204586.53232: done processing included file 51385 1727204586.53234: iterating over new_blocks loaded from include file 51385 1727204586.53236: in VariableManager get_vars() 51385 1727204586.53253: done with get_vars() 51385 1727204586.53255: filtering new block on tags 51385 1727204586.53276: done filtering new block on tags 51385 1727204586.53278: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed-node1 51385 1727204586.53283: extending task lists for all hosts with included blocks 51385 1727204586.58276: done extending task lists 51385 1727204586.58278: done processing included 
files 51385 1727204586.58279: results queue empty 51385 1727204586.58280: checking for any_errors_fatal 51385 1727204586.58282: done checking for any_errors_fatal 51385 1727204586.58283: checking for max_fail_percentage 51385 1727204586.58284: done checking for max_fail_percentage 51385 1727204586.58285: checking to see if all hosts have failed and the running result is not ok 51385 1727204586.58285: done checking to see if all hosts have failed 51385 1727204586.58286: getting the remaining hosts for this loop 51385 1727204586.58288: done getting the remaining hosts for this loop 51385 1727204586.58290: getting the next task for host managed-node1 51385 1727204586.58294: done getting next task for host managed-node1 51385 1727204586.58296: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 51385 1727204586.58299: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51385 1727204586.58301: getting variables 51385 1727204586.58302: in VariableManager get_vars() 51385 1727204586.58320: Calling all_inventory to load vars for managed-node1 51385 1727204586.58322: Calling groups_inventory to load vars for managed-node1 51385 1727204586.58324: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204586.58331: Calling all_plugins_play to load vars for managed-node1 51385 1727204586.58333: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204586.58336: Calling groups_plugins_play to load vars for managed-node1 51385 1727204586.58575: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204586.58786: done with get_vars() 51385 1727204586.58798: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Tuesday 24 September 2024 15:03:06 -0400 (0:00:00.113) 0:00:04.992 ***** 51385 1727204586.58875: entering _queue_task() for managed-node1/include_tasks 51385 1727204586.59278: worker is 1 (out of 1 available) 51385 1727204586.59292: exiting _queue_task() for managed-node1/include_tasks 51385 1727204586.59302: done queuing things up, now waiting for results queue to drain 51385 1727204586.59303: waiting for pending results... 
51385 1727204586.59730: running TaskExecutor() for managed-node1/TASK: Include the task 'get_current_interfaces.yml' 51385 1727204586.59840: in run() - task 0affcd87-79f5-6b1f-5706-000000000120 51385 1727204586.59970: variable 'ansible_search_path' from source: unknown 51385 1727204586.59979: variable 'ansible_search_path' from source: unknown 51385 1727204586.60016: calling self._execute() 51385 1727204586.60104: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204586.60116: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204586.60129: variable 'omit' from source: magic vars 51385 1727204586.60593: variable 'ansible_distribution_major_version' from source: facts 51385 1727204586.60613: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204586.60626: _execute() done 51385 1727204586.60634: dumping result to json 51385 1727204586.60642: done dumping result, returning 51385 1727204586.60652: done running TaskExecutor() for managed-node1/TASK: Include the task 'get_current_interfaces.yml' [0affcd87-79f5-6b1f-5706-000000000120] 51385 1727204586.60665: sending task result for task 0affcd87-79f5-6b1f-5706-000000000120 51385 1727204586.60773: no more pending results, returning what we have 51385 1727204586.60779: in VariableManager get_vars() 51385 1727204586.60821: Calling all_inventory to load vars for managed-node1 51385 1727204586.60825: Calling groups_inventory to load vars for managed-node1 51385 1727204586.60827: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204586.60843: Calling all_plugins_play to load vars for managed-node1 51385 1727204586.60846: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204586.60848: Calling groups_plugins_play to load vars for managed-node1 51385 1727204586.61076: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 
1727204586.61291: done with get_vars() 51385 1727204586.61301: variable 'ansible_search_path' from source: unknown 51385 1727204586.61302: variable 'ansible_search_path' from source: unknown 51385 1727204586.61345: we have included files to process 51385 1727204586.61347: generating all_blocks data 51385 1727204586.61349: done generating all_blocks data 51385 1727204586.61351: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 51385 1727204586.61352: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 51385 1727204586.61354: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 51385 1727204586.61873: done sending task result for task 0affcd87-79f5-6b1f-5706-000000000120 51385 1727204586.61877: WORKER PROCESS EXITING 51385 1727204586.62056: done processing included file 51385 1727204586.62058: iterating over new_blocks loaded from include file 51385 1727204586.62059: in VariableManager get_vars() 51385 1727204586.62081: done with get_vars() 51385 1727204586.62083: filtering new block on tags 51385 1727204586.62101: done filtering new block on tags 51385 1727204586.62103: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed-node1 51385 1727204586.62108: extending task lists for all hosts with included blocks 51385 1727204586.62212: done extending task lists 51385 1727204586.62214: done processing included files 51385 1727204586.62215: results queue empty 51385 1727204586.62215: checking for any_errors_fatal 51385 1727204586.62220: done checking for any_errors_fatal 51385 1727204586.62221: checking for max_fail_percentage 51385 
1727204586.62222: done checking for max_fail_percentage 51385 1727204586.62223: checking to see if all hosts have failed and the running result is not ok 51385 1727204586.62224: done checking to see if all hosts have failed 51385 1727204586.62224: getting the remaining hosts for this loop 51385 1727204586.62226: done getting the remaining hosts for this loop 51385 1727204586.62228: getting the next task for host managed-node1 51385 1727204586.62232: done getting next task for host managed-node1 51385 1727204586.62234: ^ task is: TASK: Gather current interface info 51385 1727204586.62237: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51385 1727204586.62240: getting variables 51385 1727204586.62241: in VariableManager get_vars() 51385 1727204586.62254: Calling all_inventory to load vars for managed-node1 51385 1727204586.62257: Calling groups_inventory to load vars for managed-node1 51385 1727204586.62259: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204586.62265: Calling all_plugins_play to load vars for managed-node1 51385 1727204586.62268: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204586.62271: Calling groups_plugins_play to load vars for managed-node1 51385 1727204586.62688: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204586.63231: done with get_vars() 51385 1727204586.63242: done getting variables 51385 1727204586.63286: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Tuesday 24 September 2024 15:03:06 -0400 (0:00:00.044) 0:00:05.037 ***** 51385 1727204586.63313: entering _queue_task() for managed-node1/command 51385 1727204586.63614: worker is 1 (out of 1 available) 51385 1727204586.63629: exiting _queue_task() for managed-node1/command 51385 1727204586.63641: done queuing things up, now waiting for results queue to drain 51385 1727204586.63642: waiting for pending results... 
51385 1727204586.64017: running TaskExecutor() for managed-node1/TASK: Gather current interface info 51385 1727204586.64226: in run() - task 0affcd87-79f5-6b1f-5706-0000000001ff 51385 1727204586.64245: variable 'ansible_search_path' from source: unknown 51385 1727204586.64253: variable 'ansible_search_path' from source: unknown 51385 1727204586.64294: calling self._execute() 51385 1727204586.64395: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204586.64406: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204586.64431: variable 'omit' from source: magic vars 51385 1727204586.64829: variable 'ansible_distribution_major_version' from source: facts 51385 1727204586.64853: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204586.64871: variable 'omit' from source: magic vars 51385 1727204586.64922: variable 'omit' from source: magic vars 51385 1727204586.64968: variable 'omit' from source: magic vars 51385 1727204586.65016: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 51385 1727204586.65056: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 51385 1727204586.65093: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 51385 1727204586.65116: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 51385 1727204586.65134: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 51385 1727204586.65173: variable 'inventory_hostname' from source: host vars for 'managed-node1' 51385 1727204586.65188: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204586.65199: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 
1727204586.65312: Set connection var ansible_pipelining to False 51385 1727204586.65320: Set connection var ansible_shell_type to sh 51385 1727204586.65335: Set connection var ansible_module_compression to ZIP_DEFLATED 51385 1727204586.65347: Set connection var ansible_timeout to 10 51385 1727204586.65355: Set connection var ansible_connection to ssh 51385 1727204586.65367: Set connection var ansible_shell_executable to /bin/sh 51385 1727204586.65396: variable 'ansible_shell_executable' from source: unknown 51385 1727204586.65412: variable 'ansible_connection' from source: unknown 51385 1727204586.65421: variable 'ansible_module_compression' from source: unknown 51385 1727204586.65427: variable 'ansible_shell_type' from source: unknown 51385 1727204586.65434: variable 'ansible_shell_executable' from source: unknown 51385 1727204586.65441: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204586.65448: variable 'ansible_pipelining' from source: unknown 51385 1727204586.65455: variable 'ansible_timeout' from source: unknown 51385 1727204586.65462: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204586.65659: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 51385 1727204586.65745: variable 'omit' from source: magic vars 51385 1727204586.65754: starting attempt loop 51385 1727204586.65760: running the handler 51385 1727204586.65779: _low_level_execute_command(): starting 51385 1727204586.65825: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 51385 1727204586.66979: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 51385 1727204586.66984: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204586.67018: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 51385 1727204586.67021: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204586.67024: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found <<< 51385 1727204586.67028: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204586.67172: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204586.67176: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204586.67178: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204586.67261: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204586.68793: stdout chunk (state=3): >>>/root <<< 51385 1727204586.68975: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204586.68998: stdout chunk (state=3): >>><<< 51385 1727204586.69001: stderr chunk (state=3): >>><<< 51385 1727204586.69123: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204586.69127: _low_level_execute_command(): starting 51385 1727204586.69130: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204586.690246-51750-110448194732511 `" && echo ansible-tmp-1727204586.690246-51750-110448194732511="` echo /root/.ansible/tmp/ansible-tmp-1727204586.690246-51750-110448194732511 `" ) && sleep 0' 51385 1727204586.70617: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204586.70621: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204586.70658: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 51385 1727204586.70671: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204586.70676: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204586.70739: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204586.70742: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204586.70744: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204586.70896: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204586.72646: stdout chunk (state=3): >>>ansible-tmp-1727204586.690246-51750-110448194732511=/root/.ansible/tmp/ansible-tmp-1727204586.690246-51750-110448194732511 <<< 51385 1727204586.72757: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204586.72836: stderr chunk (state=3): >>><<< 51385 1727204586.72841: stdout chunk (state=3): >>><<< 51385 1727204586.73074: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204586.690246-51750-110448194732511=/root/.ansible/tmp/ansible-tmp-1727204586.690246-51750-110448194732511 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204586.73078: variable 'ansible_module_compression' from source: unknown 51385 1727204586.73080: ANSIBALLZ: Using generic lock for ansible.legacy.command 51385 1727204586.73081: ANSIBALLZ: Acquiring lock 51385 1727204586.73083: ANSIBALLZ: Lock acquired: 140124837667440 51385 1727204586.73085: ANSIBALLZ: Creating module 51385 1727204586.91157: ANSIBALLZ: Writing module into payload 51385 1727204586.91278: ANSIBALLZ: Writing module 51385 1727204586.91310: ANSIBALLZ: Renaming module 51385 1727204586.91319: ANSIBALLZ: Done creating module 51385 1727204586.91340: variable 'ansible_facts' from source: unknown 51385 1727204586.91422: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204586.690246-51750-110448194732511/AnsiballZ_command.py 51385 1727204586.91585: Sending initial data 51385 1727204586.91589: Sent initial data (155 bytes) 51385 1727204586.92527: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204586.92542: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204586.92557: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204586.92576: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204586.92615: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204586.92628: stderr chunk (state=3): >>>debug2: match not found <<< 51385 1727204586.92642: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204586.92661: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 51385 1727204586.92677: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 51385 1727204586.92690: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 51385 1727204586.92703: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204586.92717: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204586.92734: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204586.92744: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204586.92752: stderr chunk (state=3): >>>debug2: match found <<< 51385 1727204586.92763: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204586.92840: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204586.92860: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204586.92877: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204586.92962: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204586.94666: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: 
Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 51385 1727204586.94710: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 51385 1727204586.94761: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-51385tpxlmlox/tmpronxd85e /root/.ansible/tmp/ansible-tmp-1727204586.690246-51750-110448194732511/AnsiballZ_command.py <<< 51385 1727204586.94809: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 51385 1727204586.95740: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204586.95992: stderr chunk (state=3): >>><<< 51385 1727204586.95996: stdout chunk (state=3): >>><<< 51385 1727204586.95998: done transferring module to remote 51385 1727204586.96000: _low_level_execute_command(): starting 51385 1727204586.96003: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204586.690246-51750-110448194732511/ /root/.ansible/tmp/ansible-tmp-1727204586.690246-51750-110448194732511/AnsiballZ_command.py && sleep 0' 51385 1727204586.96650: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204586.96655: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204586.96693: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 51385 1727204586.96700: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204586.96702: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found <<< 51385 1727204586.96704: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204586.96754: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204586.96758: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204586.96763: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204586.96813: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204586.98545: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204586.98603: stderr chunk (state=3): >>><<< 51385 1727204586.98609: stdout chunk (state=3): >>><<< 51385 1727204586.98625: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204586.98628: _low_level_execute_command(): starting 51385 1727204586.98633: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204586.690246-51750-110448194732511/AnsiballZ_command.py && sleep 0' 51385 1727204586.99098: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204586.99102: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204586.99122: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204586.99140: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204586.99186: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204586.99198: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204586.99271: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204587.12608: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo\nrpltstbr", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 15:03:07.122589", "end": "2024-09-24 15:03:07.125501", "delta": "0:00:00.002912", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 51385 1727204587.13701: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
<<< 51385 1727204587.13758: stderr chunk (state=3): >>><<< 51385 1727204587.13767: stdout chunk (state=3): >>><<< 51385 1727204587.13781: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo\nrpltstbr", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 15:03:07.122589", "end": "2024-09-24 15:03:07.125501", "delta": "0:00:00.002912", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
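As the `_low_level_execute_command() done` entry above shows, the remote module returns its entire result as a single JSON document on stdout, which the controller then parses. A minimal sketch, using only Python's standard `json` module, of parsing the exact payload from the log and recovering the interface list (the trailing `invocation`/`module_args` portion is omitted here for brevity; the later `Set current_interfaces` task stores this same list as a fact):

```python
import json

# Result payload emitted by AnsiballZ_command.py, copied from the log above
# ("invocation" key omitted for brevity)
raw = (
    '{"changed": true, "stdout": "bonding_masters\\neth0\\nlo\\nrpltstbr", '
    '"stderr": "", "rc": 0, "cmd": ["ls", "-1"], '
    '"start": "2024-09-24 15:03:07.122589", '
    '"end": "2024-09-24 15:03:07.125501", '
    '"delta": "0:00:00.002912", "msg": ""}'
)

result = json.loads(raw)

# Same transformation the later "Set current_interfaces" set_fact performs:
# stdout_lines is just stdout split on newlines.
current_interfaces = result["stdout"].splitlines()
print(current_interfaces)  # ['bonding_masters', 'eth0', 'lo', 'rpltstbr']
```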
51385 1727204587.13811: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204586.690246-51750-110448194732511/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 51385 1727204587.13818: _low_level_execute_command(): starting 51385 1727204587.13823: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204586.690246-51750-110448194732511/ > /dev/null 2>&1 && sleep 0' 51385 1727204587.14299: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204587.14319: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204587.14338: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204587.14349: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204587.14397: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204587.14409: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204587.14475: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204587.16218: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204587.16277: stderr chunk (state=3): >>><<< 51385 1727204587.16280: stdout chunk (state=3): >>><<< 51385 1727204587.16294: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 
1727204587.16301: handler run complete 51385 1727204587.16319: Evaluated conditional (False): False 51385 1727204587.16327: attempt loop complete, returning result 51385 1727204587.16330: _execute() done 51385 1727204587.16332: dumping result to json 51385 1727204587.16337: done dumping result, returning 51385 1727204587.16344: done running TaskExecutor() for managed-node1/TASK: Gather current interface info [0affcd87-79f5-6b1f-5706-0000000001ff] 51385 1727204587.16349: sending task result for task 0affcd87-79f5-6b1f-5706-0000000001ff 51385 1727204587.16450: done sending task result for task 0affcd87-79f5-6b1f-5706-0000000001ff 51385 1727204587.16453: WORKER PROCESS EXITING
ok: [managed-node1] => {
    "changed": false,
    "cmd": [
        "ls",
        "-1"
    ],
    "delta": "0:00:00.002912",
    "end": "2024-09-24 15:03:07.125501",
    "rc": 0,
    "start": "2024-09-24 15:03:07.122589"
}

STDOUT:

bonding_masters
eth0
lo
rpltstbr

51385 1727204587.16532: no more pending results, returning what we have 51385 1727204587.16536: results queue empty 51385 1727204587.16537: checking for any_errors_fatal 51385 1727204587.16538: done checking for any_errors_fatal 51385 1727204587.16539: checking for max_fail_percentage 51385 1727204587.16540: done checking for max_fail_percentage 51385 1727204587.16541: checking to see if all hosts have failed and the running result is not ok 51385 1727204587.16542: done checking to see if all hosts have failed 51385 1727204587.16543: getting the remaining hosts for this loop 51385 1727204587.16544: done getting the remaining hosts for this loop 51385 1727204587.16547: getting the next task for host managed-node1 51385 1727204587.16556: done getting next task for host managed-node1 51385 1727204587.16559: ^ task is: TASK: Set current_interfaces 51385 1727204587.16563: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 51385 1727204587.16572: getting variables 51385 1727204587.16574: in VariableManager get_vars() 51385 1727204587.16615: Calling all_inventory to load vars for managed-node1 51385 1727204587.16618: Calling groups_inventory to load vars for managed-node1 51385 1727204587.16620: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204587.16632: Calling all_plugins_play to load vars for managed-node1 51385 1727204587.16634: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204587.16636: Calling groups_plugins_play to load vars for managed-node1 51385 1727204587.16801: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204587.16934: done with get_vars() 51385 1727204587.16943: done getting variables 51385 1727204587.16989: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: 
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Tuesday 24 September 2024 15:03:07 -0400 (0:00:00.536) 0:00:05.574 ***** 51385 1727204587.17010: entering _queue_task() for managed-node1/set_fact 51385 1727204587.17212: worker is 1 (out of 1 available) 51385 1727204587.17225: exiting _queue_task() for managed-node1/set_fact 51385 1727204587.17237: done queuing things up, now waiting for results queue to drain 51385 1727204587.17240: waiting for pending results... 51385 1727204587.17396: running TaskExecutor() for managed-node1/TASK: Set current_interfaces 51385 1727204587.17465: in run() - task 0affcd87-79f5-6b1f-5706-000000000200 51385 1727204587.17479: variable 'ansible_search_path' from source: unknown 51385 1727204587.17483: variable 'ansible_search_path' from source: unknown 51385 1727204587.17513: calling self._execute() 51385 1727204587.17581: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204587.17589: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204587.17599: variable 'omit' from source: magic vars 51385 1727204587.17878: variable 'ansible_distribution_major_version' from source: facts 51385 1727204587.17889: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204587.17894: variable 'omit' from source: magic vars 51385 1727204587.17927: variable 'omit' from source: magic vars 51385 1727204587.18003: variable '_current_interfaces' from source: set_fact 51385 1727204587.18108: variable 'omit' from source: magic vars 51385 1727204587.18140: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 51385 1727204587.18168: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 51385 1727204587.18186: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 51385 
1727204587.18199: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 51385 1727204587.18208: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 51385 1727204587.18232: variable 'inventory_hostname' from source: host vars for 'managed-node1' 51385 1727204587.18235: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204587.18237: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204587.18308: Set connection var ansible_pipelining to False 51385 1727204587.18311: Set connection var ansible_shell_type to sh 51385 1727204587.18319: Set connection var ansible_module_compression to ZIP_DEFLATED 51385 1727204587.18325: Set connection var ansible_timeout to 10 51385 1727204587.18328: Set connection var ansible_connection to ssh 51385 1727204587.18334: Set connection var ansible_shell_executable to /bin/sh 51385 1727204587.18351: variable 'ansible_shell_executable' from source: unknown 51385 1727204587.18354: variable 'ansible_connection' from source: unknown 51385 1727204587.18356: variable 'ansible_module_compression' from source: unknown 51385 1727204587.18358: variable 'ansible_shell_type' from source: unknown 51385 1727204587.18363: variable 'ansible_shell_executable' from source: unknown 51385 1727204587.18367: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204587.18369: variable 'ansible_pipelining' from source: unknown 51385 1727204587.18372: variable 'ansible_timeout' from source: unknown 51385 1727204587.18374: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204587.18472: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 51385 1727204587.18481: variable 'omit' from source: magic vars 51385 1727204587.18486: starting attempt loop 51385 1727204587.18489: running the handler 51385 1727204587.18499: handler run complete 51385 1727204587.18509: attempt loop complete, returning result 51385 1727204587.18511: _execute() done 51385 1727204587.18514: dumping result to json 51385 1727204587.18516: done dumping result, returning 51385 1727204587.18522: done running TaskExecutor() for managed-node1/TASK: Set current_interfaces [0affcd87-79f5-6b1f-5706-000000000200] 51385 1727204587.18527: sending task result for task 0affcd87-79f5-6b1f-5706-000000000200 51385 1727204587.18607: done sending task result for task 0affcd87-79f5-6b1f-5706-000000000200 51385 1727204587.18614: WORKER PROCESS EXITING
ok: [managed-node1] => {
    "ansible_facts": {
        "current_interfaces": [
            "bonding_masters",
            "eth0",
            "lo",
            "rpltstbr"
        ]
    },
    "changed": false
}

51385 1727204587.18671: no more pending results, returning what we have 51385 1727204587.18674: results queue empty 51385 1727204587.18675: checking for any_errors_fatal 51385 1727204587.18681: done checking for any_errors_fatal 51385 1727204587.18682: checking for max_fail_percentage 51385 1727204587.18684: done checking for max_fail_percentage 51385 1727204587.18684: checking to see if all hosts have failed and the running result is not ok 51385 1727204587.18685: done checking to see if all hosts have failed 51385 1727204587.18686: getting the remaining hosts for this loop 51385 1727204587.18688: done getting the remaining hosts for this loop 51385 1727204587.18691: getting the next task for host managed-node1 51385 1727204587.18699: done getting next task for host managed-node1 51385 1727204587.18701: ^ task is: TASK: Show current_interfaces 51385 1727204587.18704: ^ state is: HOST STATE: 
block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 51385 1727204587.18707: getting variables 51385 1727204587.18709: in VariableManager get_vars() 51385 1727204587.18750: Calling all_inventory to load vars for managed-node1 51385 1727204587.18752: Calling groups_inventory to load vars for managed-node1 51385 1727204587.18754: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204587.18767: Calling all_plugins_play to load vars for managed-node1 51385 1727204587.18770: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204587.18772: Calling groups_plugins_play to load vars for managed-node1 51385 1727204587.18930: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204587.19057: done with get_vars() 51385 1727204587.19068: done getting variables 51385 1727204587.19134: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Tuesday 24 September 2024 15:03:07 -0400 (0:00:00.021) 0:00:05.595 ***** 51385 1727204587.19154: 
entering _queue_task() for managed-node1/debug 51385 1727204587.19156: Creating lock for debug 51385 1727204587.19358: worker is 1 (out of 1 available) 51385 1727204587.19375: exiting _queue_task() for managed-node1/debug 51385 1727204587.19386: done queuing things up, now waiting for results queue to drain 51385 1727204587.19387: waiting for pending results... 51385 1727204587.19534: running TaskExecutor() for managed-node1/TASK: Show current_interfaces 51385 1727204587.19598: in run() - task 0affcd87-79f5-6b1f-5706-000000000121 51385 1727204587.19616: variable 'ansible_search_path' from source: unknown 51385 1727204587.19620: variable 'ansible_search_path' from source: unknown 51385 1727204587.19649: calling self._execute() 51385 1727204587.19719: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204587.19723: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204587.19732: variable 'omit' from source: magic vars 51385 1727204587.20009: variable 'ansible_distribution_major_version' from source: facts 51385 1727204587.20019: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204587.20026: variable 'omit' from source: magic vars 51385 1727204587.20055: variable 'omit' from source: magic vars 51385 1727204587.20126: variable 'current_interfaces' from source: set_fact 51385 1727204587.20149: variable 'omit' from source: magic vars 51385 1727204587.20187: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 51385 1727204587.20213: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 51385 1727204587.20229: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 51385 1727204587.20242: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 51385 1727204587.20253: Loading 
ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 51385 1727204587.20280: variable 'inventory_hostname' from source: host vars for 'managed-node1' 51385 1727204587.20283: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204587.20286: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204587.20359: Set connection var ansible_pipelining to False 51385 1727204587.20362: Set connection var ansible_shell_type to sh 51385 1727204587.20372: Set connection var ansible_module_compression to ZIP_DEFLATED 51385 1727204587.20379: Set connection var ansible_timeout to 10 51385 1727204587.20384: Set connection var ansible_connection to ssh 51385 1727204587.20390: Set connection var ansible_shell_executable to /bin/sh 51385 1727204587.20407: variable 'ansible_shell_executable' from source: unknown 51385 1727204587.20410: variable 'ansible_connection' from source: unknown 51385 1727204587.20412: variable 'ansible_module_compression' from source: unknown 51385 1727204587.20415: variable 'ansible_shell_type' from source: unknown 51385 1727204587.20417: variable 'ansible_shell_executable' from source: unknown 51385 1727204587.20420: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204587.20422: variable 'ansible_pipelining' from source: unknown 51385 1727204587.20424: variable 'ansible_timeout' from source: unknown 51385 1727204587.20429: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204587.20532: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 51385 1727204587.20541: variable 'omit' from source: magic vars 51385 
1727204587.20546: starting attempt loop 51385 1727204587.20549: running the handler 51385 1727204587.20588: handler run complete 51385 1727204587.20599: attempt loop complete, returning result 51385 1727204587.20602: _execute() done 51385 1727204587.20605: dumping result to json 51385 1727204587.20608: done dumping result, returning 51385 1727204587.20613: done running TaskExecutor() for managed-node1/TASK: Show current_interfaces [0affcd87-79f5-6b1f-5706-000000000121] 51385 1727204587.20618: sending task result for task 0affcd87-79f5-6b1f-5706-000000000121 51385 1727204587.20700: done sending task result for task 0affcd87-79f5-6b1f-5706-000000000121 51385 1727204587.20703: WORKER PROCESS EXITING ok: [managed-node1] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo', 'rpltstbr'] 51385 1727204587.20746: no more pending results, returning what we have 51385 1727204587.20749: results queue empty 51385 1727204587.20750: checking for any_errors_fatal 51385 1727204587.20755: done checking for any_errors_fatal 51385 1727204587.20756: checking for max_fail_percentage 51385 1727204587.20757: done checking for max_fail_percentage 51385 1727204587.20758: checking to see if all hosts have failed and the running result is not ok 51385 1727204587.20759: done checking to see if all hosts have failed 51385 1727204587.20760: getting the remaining hosts for this loop 51385 1727204587.20766: done getting the remaining hosts for this loop 51385 1727204587.20770: getting the next task for host managed-node1 51385 1727204587.20777: done getting next task for host managed-node1 51385 1727204587.20780: ^ task is: TASK: Include the task 'manage_test_interface.yml' 51385 1727204587.20782: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51385 1727204587.20785: getting variables 51385 1727204587.20786: in VariableManager get_vars() 51385 1727204587.20828: Calling all_inventory to load vars for managed-node1 51385 1727204587.20830: Calling groups_inventory to load vars for managed-node1 51385 1727204587.20832: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204587.20843: Calling all_plugins_play to load vars for managed-node1 51385 1727204587.20845: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204587.20848: Calling groups_plugins_play to load vars for managed-node1 51385 1727204587.20975: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204587.21105: done with get_vars() 51385 1727204587.21113: done getting variables TASK [Include the task 'manage_test_interface.yml'] **************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_vlan_mtu.yml:12 Tuesday 24 September 2024 15:03:07 -0400 (0:00:00.020) 0:00:05.615 ***** 51385 1727204587.21180: entering _queue_task() for managed-node1/include_tasks 51385 1727204587.21371: worker is 1 (out of 1 available) 51385 1727204587.21386: exiting _queue_task() for managed-node1/include_tasks 51385 1727204587.21397: done queuing things up, now waiting for results queue to drain 51385 1727204587.21398: waiting for pending results... 
51385 1727204587.21593: running TaskExecutor() for managed-node1/TASK: Include the task 'manage_test_interface.yml' 51385 1727204587.21670: in run() - task 0affcd87-79f5-6b1f-5706-00000000000c 51385 1727204587.21681: variable 'ansible_search_path' from source: unknown 51385 1727204587.21708: calling self._execute() 51385 1727204587.21774: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204587.21780: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204587.21789: variable 'omit' from source: magic vars 51385 1727204587.22113: variable 'ansible_distribution_major_version' from source: facts 51385 1727204587.22121: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204587.22126: _execute() done 51385 1727204587.22129: dumping result to json 51385 1727204587.22133: done dumping result, returning 51385 1727204587.22139: done running TaskExecutor() for managed-node1/TASK: Include the task 'manage_test_interface.yml' [0affcd87-79f5-6b1f-5706-00000000000c] 51385 1727204587.22145: sending task result for task 0affcd87-79f5-6b1f-5706-00000000000c 51385 1727204587.22226: done sending task result for task 0affcd87-79f5-6b1f-5706-00000000000c 51385 1727204587.22229: WORKER PROCESS EXITING 51385 1727204587.22256: no more pending results, returning what we have 51385 1727204587.22263: in VariableManager get_vars() 51385 1727204587.22309: Calling all_inventory to load vars for managed-node1 51385 1727204587.22312: Calling groups_inventory to load vars for managed-node1 51385 1727204587.22315: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204587.22326: Calling all_plugins_play to load vars for managed-node1 51385 1727204587.22328: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204587.22331: Calling groups_plugins_play to load vars for managed-node1 51385 1727204587.22506: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204587.22629: done with get_vars() 51385 1727204587.22635: variable 'ansible_search_path' from source: unknown 51385 1727204587.22644: we have included files to process 51385 1727204587.22645: generating all_blocks data 51385 1727204587.22646: done generating all_blocks data 51385 1727204587.22649: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 51385 1727204587.22650: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 51385 1727204587.22651: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 51385 1727204587.23018: in VariableManager get_vars() 51385 1727204587.23040: done with get_vars() 51385 1727204587.23244: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 51385 1727204587.23812: done processing included file 51385 1727204587.23814: iterating over new_blocks loaded from include file 51385 1727204587.23816: in VariableManager get_vars() 51385 1727204587.23834: done with get_vars() 51385 1727204587.23836: filtering new block on tags 51385 1727204587.23871: done filtering new block on tags 51385 1727204587.23874: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml for managed-node1 51385 1727204587.23879: extending task lists for all hosts with included blocks 51385 1727204587.25367: done extending task lists 51385 1727204587.25369: done processing included files 51385 1727204587.25369: results queue empty 51385 1727204587.25370: checking for any_errors_fatal 51385 1727204587.25372: done checking for 
any_errors_fatal 51385 1727204587.25373: checking for max_fail_percentage 51385 1727204587.25373: done checking for max_fail_percentage 51385 1727204587.25374: checking to see if all hosts have failed and the running result is not ok 51385 1727204587.25375: done checking to see if all hosts have failed 51385 1727204587.25375: getting the remaining hosts for this loop 51385 1727204587.25376: done getting the remaining hosts for this loop 51385 1727204587.25378: getting the next task for host managed-node1 51385 1727204587.25381: done getting next task for host managed-node1 51385 1727204587.25382: ^ task is: TASK: Ensure state in ["present", "absent"] 51385 1727204587.25384: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51385 1727204587.25385: getting variables 51385 1727204587.25386: in VariableManager get_vars() 51385 1727204587.25397: Calling all_inventory to load vars for managed-node1 51385 1727204587.25399: Calling groups_inventory to load vars for managed-node1 51385 1727204587.25400: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204587.25404: Calling all_plugins_play to load vars for managed-node1 51385 1727204587.25406: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204587.25407: Calling groups_plugins_play to load vars for managed-node1 51385 1727204587.25504: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204587.25637: done with get_vars() 51385 1727204587.25646: done getting variables 51385 1727204587.25697: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Ensure state in ["present", "absent"]] *********************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:3 Tuesday 24 September 2024 15:03:07 -0400 (0:00:00.045) 0:00:05.661 ***** 51385 1727204587.25717: entering _queue_task() for managed-node1/fail 51385 1727204587.25718: Creating lock for fail 51385 1727204587.25938: worker is 1 (out of 1 available) 51385 1727204587.25952: exiting _queue_task() for managed-node1/fail 51385 1727204587.25963: done queuing things up, now waiting for results queue to drain 51385 1727204587.25966: waiting for pending results... 
51385 1727204587.26126: running TaskExecutor() for managed-node1/TASK: Ensure state in ["present", "absent"] 51385 1727204587.26190: in run() - task 0affcd87-79f5-6b1f-5706-00000000021b 51385 1727204587.26202: variable 'ansible_search_path' from source: unknown 51385 1727204587.26206: variable 'ansible_search_path' from source: unknown 51385 1727204587.26235: calling self._execute() 51385 1727204587.26313: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204587.26329: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204587.26342: variable 'omit' from source: magic vars 51385 1727204587.26699: variable 'ansible_distribution_major_version' from source: facts 51385 1727204587.26718: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204587.26865: variable 'state' from source: include params 51385 1727204587.26878: Evaluated conditional (state not in ["present", "absent"]): False 51385 1727204587.26885: when evaluation is False, skipping this task 51385 1727204587.26891: _execute() done 51385 1727204587.26898: dumping result to json 51385 1727204587.26904: done dumping result, returning 51385 1727204587.26913: done running TaskExecutor() for managed-node1/TASK: Ensure state in ["present", "absent"] [0affcd87-79f5-6b1f-5706-00000000021b] 51385 1727204587.26923: sending task result for task 0affcd87-79f5-6b1f-5706-00000000021b 51385 1727204587.27029: done sending task result for task 0affcd87-79f5-6b1f-5706-00000000021b 51385 1727204587.27037: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "state not in [\"present\", \"absent\"]", "skip_reason": "Conditional result was False" } 51385 1727204587.27323: no more pending results, returning what we have 51385 1727204587.27326: results queue empty 51385 1727204587.27327: checking for any_errors_fatal 51385 1727204587.27329: done checking for any_errors_fatal 51385 1727204587.27330: 
checking for max_fail_percentage 51385 1727204587.27331: done checking for max_fail_percentage 51385 1727204587.27332: checking to see if all hosts have failed and the running result is not ok 51385 1727204587.27333: done checking to see if all hosts have failed 51385 1727204587.27334: getting the remaining hosts for this loop 51385 1727204587.27335: done getting the remaining hosts for this loop 51385 1727204587.27339: getting the next task for host managed-node1 51385 1727204587.27344: done getting next task for host managed-node1 51385 1727204587.27346: ^ task is: TASK: Ensure type in ["dummy", "tap", "veth"] 51385 1727204587.27349: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51385 1727204587.27352: getting variables 51385 1727204587.27354: in VariableManager get_vars() 51385 1727204587.27392: Calling all_inventory to load vars for managed-node1 51385 1727204587.27395: Calling groups_inventory to load vars for managed-node1 51385 1727204587.27397: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204587.27406: Calling all_plugins_play to load vars for managed-node1 51385 1727204587.27408: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204587.27410: Calling groups_plugins_play to load vars for managed-node1 51385 1727204587.27578: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204587.27772: done with get_vars() 51385 1727204587.27783: done getting variables 51385 1727204587.27838: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Ensure type in ["dummy", "tap", "veth"]] ********************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:8 Tuesday 24 September 2024 15:03:07 -0400 (0:00:00.021) 0:00:05.682 ***** 51385 1727204587.27870: entering _queue_task() for managed-node1/fail 51385 1727204587.28119: worker is 1 (out of 1 available) 51385 1727204587.28133: exiting _queue_task() for managed-node1/fail 51385 1727204587.28144: done queuing things up, now waiting for results queue to drain 51385 1727204587.28145: waiting for pending results... 
51385 1727204587.28406: running TaskExecutor() for managed-node1/TASK: Ensure type in ["dummy", "tap", "veth"] 51385 1727204587.28516: in run() - task 0affcd87-79f5-6b1f-5706-00000000021c 51385 1727204587.28535: variable 'ansible_search_path' from source: unknown 51385 1727204587.28542: variable 'ansible_search_path' from source: unknown 51385 1727204587.28592: calling self._execute() 51385 1727204587.28681: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204587.28694: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204587.28712: variable 'omit' from source: magic vars 51385 1727204587.29085: variable 'ansible_distribution_major_version' from source: facts 51385 1727204587.29103: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204587.29259: variable 'type' from source: play vars 51385 1727204587.29276: Evaluated conditional (type not in ["dummy", "tap", "veth"]): False 51385 1727204587.29283: when evaluation is False, skipping this task 51385 1727204587.29290: _execute() done 51385 1727204587.29296: dumping result to json 51385 1727204587.29302: done dumping result, returning 51385 1727204587.29311: done running TaskExecutor() for managed-node1/TASK: Ensure type in ["dummy", "tap", "veth"] [0affcd87-79f5-6b1f-5706-00000000021c] 51385 1727204587.29321: sending task result for task 0affcd87-79f5-6b1f-5706-00000000021c skipping: [managed-node1] => { "changed": false, "false_condition": "type not in [\"dummy\", \"tap\", \"veth\"]", "skip_reason": "Conditional result was False" } 51385 1727204587.29456: no more pending results, returning what we have 51385 1727204587.29463: results queue empty 51385 1727204587.29465: checking for any_errors_fatal 51385 1727204587.29471: done checking for any_errors_fatal 51385 1727204587.29472: checking for max_fail_percentage 51385 1727204587.29474: done checking for max_fail_percentage 51385 1727204587.29475: checking to see if all 
hosts have failed and the running result is not ok 51385 1727204587.29476: done checking to see if all hosts have failed 51385 1727204587.29477: getting the remaining hosts for this loop 51385 1727204587.29479: done getting the remaining hosts for this loop 51385 1727204587.29483: getting the next task for host managed-node1 51385 1727204587.29491: done getting next task for host managed-node1 51385 1727204587.29494: ^ task is: TASK: Include the task 'show_interfaces.yml' 51385 1727204587.29498: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51385 1727204587.29502: getting variables 51385 1727204587.29503: in VariableManager get_vars() 51385 1727204587.29545: Calling all_inventory to load vars for managed-node1 51385 1727204587.29548: Calling groups_inventory to load vars for managed-node1 51385 1727204587.29550: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204587.29568: Calling all_plugins_play to load vars for managed-node1 51385 1727204587.29571: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204587.29574: Calling groups_plugins_play to load vars for managed-node1 51385 1727204587.29831: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204587.30313: done with get_vars() 51385 1727204587.30324: done getting variables 51385 1727204587.30355: done sending task result for task 0affcd87-79f5-6b1f-5706-00000000021c 51385 1727204587.30358: WORKER PROCESS EXITING TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:13 Tuesday 24 September 2024 15:03:07 -0400 (0:00:00.025) 0:00:05.708 ***** 51385 1727204587.30437: entering _queue_task() for managed-node1/include_tasks 51385 1727204587.30692: worker is 1 (out of 1 available) 51385 1727204587.30708: exiting _queue_task() for managed-node1/include_tasks 51385 1727204587.30719: done queuing things up, now waiting for results queue to drain 51385 1727204587.30720: waiting for pending results... 
51385 1727204587.30986: running TaskExecutor() for managed-node1/TASK: Include the task 'show_interfaces.yml' 51385 1727204587.31082: in run() - task 0affcd87-79f5-6b1f-5706-00000000021d 51385 1727204587.31099: variable 'ansible_search_path' from source: unknown 51385 1727204587.31105: variable 'ansible_search_path' from source: unknown 51385 1727204587.31142: calling self._execute() 51385 1727204587.31238: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204587.31251: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204587.31275: variable 'omit' from source: magic vars 51385 1727204587.31675: variable 'ansible_distribution_major_version' from source: facts 51385 1727204587.31694: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204587.31709: _execute() done 51385 1727204587.31717: dumping result to json 51385 1727204587.31725: done dumping result, returning 51385 1727204587.31735: done running TaskExecutor() for managed-node1/TASK: Include the task 'show_interfaces.yml' [0affcd87-79f5-6b1f-5706-00000000021d] 51385 1727204587.31746: sending task result for task 0affcd87-79f5-6b1f-5706-00000000021d 51385 1727204587.31883: no more pending results, returning what we have 51385 1727204587.31889: in VariableManager get_vars() 51385 1727204587.31939: Calling all_inventory to load vars for managed-node1 51385 1727204587.31943: Calling groups_inventory to load vars for managed-node1 51385 1727204587.31946: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204587.31965: Calling all_plugins_play to load vars for managed-node1 51385 1727204587.31969: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204587.31973: Calling groups_plugins_play to load vars for managed-node1 51385 1727204587.32192: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204587.32411: done with 
get_vars() 51385 1727204587.32421: variable 'ansible_search_path' from source: unknown 51385 1727204587.32422: variable 'ansible_search_path' from source: unknown 51385 1727204587.32711: we have included files to process 51385 1727204587.32713: generating all_blocks data 51385 1727204587.32715: done generating all_blocks data 51385 1727204587.32720: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 51385 1727204587.32721: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 51385 1727204587.32727: done sending task result for task 0affcd87-79f5-6b1f-5706-00000000021d 51385 1727204587.32730: WORKER PROCESS EXITING 51385 1727204587.32733: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 51385 1727204587.32843: in VariableManager get_vars() 51385 1727204587.32872: done with get_vars() 51385 1727204587.32989: done processing included file 51385 1727204587.32991: iterating over new_blocks loaded from include file 51385 1727204587.32993: in VariableManager get_vars() 51385 1727204587.33012: done with get_vars() 51385 1727204587.33014: filtering new block on tags 51385 1727204587.33033: done filtering new block on tags 51385 1727204587.33036: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed-node1 51385 1727204587.33041: extending task lists for all hosts with included blocks 51385 1727204587.33515: done extending task lists 51385 1727204587.33516: done processing included files 51385 1727204587.33517: results queue empty 51385 1727204587.33518: checking for any_errors_fatal 51385 1727204587.33521: done checking for any_errors_fatal 51385 1727204587.33522: checking for 
max_fail_percentage 51385 1727204587.33523: done checking for max_fail_percentage 51385 1727204587.33524: checking to see if all hosts have failed and the running result is not ok 51385 1727204587.33525: done checking to see if all hosts have failed 51385 1727204587.33525: getting the remaining hosts for this loop 51385 1727204587.33527: done getting the remaining hosts for this loop 51385 1727204587.33529: getting the next task for host managed-node1 51385 1727204587.33534: done getting next task for host managed-node1 51385 1727204587.33536: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 51385 1727204587.33539: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51385 1727204587.33541: getting variables 51385 1727204587.33542: in VariableManager get_vars() 51385 1727204587.33555: Calling all_inventory to load vars for managed-node1 51385 1727204587.33558: Calling groups_inventory to load vars for managed-node1 51385 1727204587.33560: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204587.33569: Calling all_plugins_play to load vars for managed-node1 51385 1727204587.33572: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204587.33575: Calling groups_plugins_play to load vars for managed-node1 51385 1727204587.33723: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204587.33936: done with get_vars() 51385 1727204587.33946: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Tuesday 24 September 2024 15:03:07 -0400 (0:00:00.035) 0:00:05.744 ***** 51385 1727204587.34025: entering _queue_task() for managed-node1/include_tasks 51385 1727204587.34322: worker is 1 (out of 1 available) 51385 1727204587.34335: exiting _queue_task() for managed-node1/include_tasks 51385 1727204587.34347: done queuing things up, now waiting for results queue to drain 51385 1727204587.34348: waiting for pending results... 
51385 1727204587.34614: running TaskExecutor() for managed-node1/TASK: Include the task 'get_current_interfaces.yml' 51385 1727204587.34726: in run() - task 0affcd87-79f5-6b1f-5706-000000000314 51385 1727204587.34744: variable 'ansible_search_path' from source: unknown 51385 1727204587.34751: variable 'ansible_search_path' from source: unknown 51385 1727204587.34802: calling self._execute() 51385 1727204587.34895: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204587.34909: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204587.34923: variable 'omit' from source: magic vars 51385 1727204587.35306: variable 'ansible_distribution_major_version' from source: facts 51385 1727204587.35325: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204587.35342: _execute() done 51385 1727204587.35350: dumping result to json 51385 1727204587.35359: done dumping result, returning 51385 1727204587.35375: done running TaskExecutor() for managed-node1/TASK: Include the task 'get_current_interfaces.yml' [0affcd87-79f5-6b1f-5706-000000000314] 51385 1727204587.35387: sending task result for task 0affcd87-79f5-6b1f-5706-000000000314 51385 1727204587.35512: no more pending results, returning what we have 51385 1727204587.35518: in VariableManager get_vars() 51385 1727204587.35574: Calling all_inventory to load vars for managed-node1 51385 1727204587.35578: Calling groups_inventory to load vars for managed-node1 51385 1727204587.35580: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204587.35595: Calling all_plugins_play to load vars for managed-node1 51385 1727204587.35598: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204587.35601: Calling groups_plugins_play to load vars for managed-node1 51385 1727204587.35817: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 
1727204587.36081: done with get_vars() 51385 1727204587.36089: variable 'ansible_search_path' from source: unknown 51385 1727204587.36090: variable 'ansible_search_path' from source: unknown 51385 1727204587.36153: we have included files to process 51385 1727204587.36154: generating all_blocks data 51385 1727204587.36156: done generating all_blocks data 51385 1727204587.36157: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 51385 1727204587.36158: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 51385 1727204587.36426: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 51385 1727204587.36529: done sending task result for task 0affcd87-79f5-6b1f-5706-000000000314 51385 1727204587.36533: WORKER PROCESS EXITING 51385 1727204587.36716: done processing included file 51385 1727204587.36718: iterating over new_blocks loaded from include file 51385 1727204587.36720: in VariableManager get_vars() 51385 1727204587.36741: done with get_vars() 51385 1727204587.36743: filtering new block on tags 51385 1727204587.36768: done filtering new block on tags 51385 1727204587.36770: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed-node1 51385 1727204587.36776: extending task lists for all hosts with included blocks 51385 1727204587.36932: done extending task lists 51385 1727204587.36934: done processing included files 51385 1727204587.36935: results queue empty 51385 1727204587.36935: checking for any_errors_fatal 51385 1727204587.36938: done checking for any_errors_fatal 51385 1727204587.36939: checking for max_fail_percentage 51385 
1727204587.36940: done checking for max_fail_percentage 51385 1727204587.36941: checking to see if all hosts have failed and the running result is not ok 51385 1727204587.36942: done checking to see if all hosts have failed 51385 1727204587.36943: getting the remaining hosts for this loop 51385 1727204587.36944: done getting the remaining hosts for this loop 51385 1727204587.36947: getting the next task for host managed-node1 51385 1727204587.36951: done getting next task for host managed-node1 51385 1727204587.36954: ^ task is: TASK: Gather current interface info 51385 1727204587.36957: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51385 1727204587.36959: getting variables 51385 1727204587.36963: in VariableManager get_vars() 51385 1727204587.36978: Calling all_inventory to load vars for managed-node1 51385 1727204587.36981: Calling groups_inventory to load vars for managed-node1 51385 1727204587.36983: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204587.36988: Calling all_plugins_play to load vars for managed-node1 51385 1727204587.36991: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204587.36994: Calling groups_plugins_play to load vars for managed-node1 51385 1727204587.37147: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204587.37357: done with get_vars() 51385 1727204587.37373: done getting variables 51385 1727204587.37412: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Tuesday 24 September 2024 15:03:07 -0400 (0:00:00.034) 0:00:05.778 ***** 51385 1727204587.37442: entering _queue_task() for managed-node1/command 51385 1727204587.37743: worker is 1 (out of 1 available) 51385 1727204587.37758: exiting _queue_task() for managed-node1/command 51385 1727204587.37772: done queuing things up, now waiting for results queue to drain 51385 1727204587.37773: waiting for pending results... 
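The "Gather current interface info" task queued here runs `ls -1` with `chdir: /sys/class/net`, so the interface list is simply the directory listing of that sysfs path. A rough local equivalent in Python (the function name and parameterized path are my own choices for illustration):

```python
import os

def current_interfaces(sysfs_path: str = "/sys/class/net") -> list[str]:
    """Return the entries under the given sysfs path, mirroring what
    `ls -1` in /sys/class/net produces: one name per entry, sorted."""
    return sorted(os.listdir(sysfs_path))
```

On the managed node in this run it would return `['bonding_masters', 'eth0', 'lo', 'rpltstbr']`; note that `bonding_masters` is a regular file in that directory, not an interface.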
51385 1727204587.38042: running TaskExecutor() for managed-node1/TASK: Gather current interface info 51385 1727204587.38157: in run() - task 0affcd87-79f5-6b1f-5706-00000000034b 51385 1727204587.38184: variable 'ansible_search_path' from source: unknown 51385 1727204587.38191: variable 'ansible_search_path' from source: unknown 51385 1727204587.38234: calling self._execute() 51385 1727204587.38314: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204587.38327: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204587.38339: variable 'omit' from source: magic vars 51385 1727204587.38776: variable 'ansible_distribution_major_version' from source: facts 51385 1727204587.38795: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204587.38805: variable 'omit' from source: magic vars 51385 1727204587.38864: variable 'omit' from source: magic vars 51385 1727204587.38906: variable 'omit' from source: magic vars 51385 1727204587.38949: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 51385 1727204587.39000: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 51385 1727204587.39025: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 51385 1727204587.39046: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 51385 1727204587.39065: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 51385 1727204587.39103: variable 'inventory_hostname' from source: host vars for 'managed-node1' 51385 1727204587.39110: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204587.39117: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 
1727204587.39225: Set connection var ansible_pipelining to False 51385 1727204587.39233: Set connection var ansible_shell_type to sh 51385 1727204587.39248: Set connection var ansible_module_compression to ZIP_DEFLATED 51385 1727204587.39259: Set connection var ansible_timeout to 10 51385 1727204587.39272: Set connection var ansible_connection to ssh 51385 1727204587.39281: Set connection var ansible_shell_executable to /bin/sh 51385 1727204587.39312: variable 'ansible_shell_executable' from source: unknown 51385 1727204587.39320: variable 'ansible_connection' from source: unknown 51385 1727204587.39327: variable 'ansible_module_compression' from source: unknown 51385 1727204587.39332: variable 'ansible_shell_type' from source: unknown 51385 1727204587.39338: variable 'ansible_shell_executable' from source: unknown 51385 1727204587.39344: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204587.39351: variable 'ansible_pipelining' from source: unknown 51385 1727204587.39356: variable 'ansible_timeout' from source: unknown 51385 1727204587.39368: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204587.39514: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 51385 1727204587.39534: variable 'omit' from source: magic vars 51385 1727204587.39543: starting attempt loop 51385 1727204587.39549: running the handler 51385 1727204587.39572: _low_level_execute_command(): starting 51385 1727204587.39584: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 51385 1727204587.40363: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204587.40385: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 51385 1727204587.40403: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204587.40422: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204587.40471: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204587.40483: stderr chunk (state=3): >>>debug2: match not found <<< 51385 1727204587.40497: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204587.40519: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 51385 1727204587.40530: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 51385 1727204587.40540: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 51385 1727204587.40553: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204587.40575: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204587.40591: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204587.40602: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204587.40616: stderr chunk (state=3): >>>debug2: match found <<< 51385 1727204587.40630: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204587.40711: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204587.40738: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204587.40752: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204587.40850: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session 
id: 2 <<< 51385 1727204587.42429: stdout chunk (state=3): >>>/root <<< 51385 1727204587.42630: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204587.42633: stdout chunk (state=3): >>><<< 51385 1727204587.42636: stderr chunk (state=3): >>><<< 51385 1727204587.42671: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204587.42772: _low_level_execute_command(): starting 51385 1727204587.42777: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204587.4265947-51850-111613953924653 `" && echo ansible-tmp-1727204587.4265947-51850-111613953924653="` echo /root/.ansible/tmp/ansible-tmp-1727204587.4265947-51850-111613953924653 `" ) && sleep 0' 51385 1727204587.43400: stderr chunk (state=2): 
>>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204587.43414: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204587.43432: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204587.43451: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204587.43500: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204587.43512: stderr chunk (state=3): >>>debug2: match not found <<< 51385 1727204587.43526: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204587.43545: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 51385 1727204587.43555: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 51385 1727204587.43572: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 51385 1727204587.43586: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204587.43600: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204587.43615: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204587.43626: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204587.43637: stderr chunk (state=3): >>>debug2: match found <<< 51385 1727204587.43652: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204587.43732: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204587.43762: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204587.43781: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 51385 1727204587.43875: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204587.45713: stdout chunk (state=3): >>>ansible-tmp-1727204587.4265947-51850-111613953924653=/root/.ansible/tmp/ansible-tmp-1727204587.4265947-51850-111613953924653 <<< 51385 1727204587.45884: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204587.45953: stderr chunk (state=3): >>><<< 51385 1727204587.45956: stdout chunk (state=3): >>><<< 51385 1727204587.46173: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204587.4265947-51850-111613953924653=/root/.ansible/tmp/ansible-tmp-1727204587.4265947-51850-111613953924653 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204587.46177: variable 'ansible_module_compression' from source: unknown 51385 1727204587.46179: 
ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-51385tpxlmlox/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 51385 1727204587.46182: variable 'ansible_facts' from source: unknown 51385 1727204587.46209: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204587.4265947-51850-111613953924653/AnsiballZ_command.py 51385 1727204587.46396: Sending initial data 51385 1727204587.46400: Sent initial data (156 bytes) 51385 1727204587.47757: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204587.47763: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204587.47798: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204587.47802: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204587.47804: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204587.47871: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204587.47876: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204587.47892: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204587.47952: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 51385 1727204587.49682: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 51385 1727204587.49727: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 51385 1727204587.49782: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-51385tpxlmlox/tmpr1z3rcix /root/.ansible/tmp/ansible-tmp-1727204587.4265947-51850-111613953924653/AnsiballZ_command.py <<< 51385 1727204587.49834: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 51385 1727204587.51085: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204587.51351: stderr chunk (state=3): >>><<< 51385 1727204587.51354: stdout chunk (state=3): >>><<< 51385 1727204587.51357: done transferring module to remote 51385 1727204587.51359: _low_level_execute_command(): starting 51385 1727204587.51370: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204587.4265947-51850-111613953924653/ /root/.ansible/tmp/ansible-tmp-1727204587.4265947-51850-111613953924653/AnsiballZ_command.py && sleep 0' 51385 1727204587.51993: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204587.52008: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 
1727204587.52024: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204587.52047: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204587.52095: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204587.52106: stderr chunk (state=3): >>>debug2: match not found <<< 51385 1727204587.52121: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204587.52142: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 51385 1727204587.52152: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 51385 1727204587.52165: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 51385 1727204587.52176: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204587.52188: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204587.52201: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204587.52211: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204587.52219: stderr chunk (state=3): >>>debug2: match found <<< 51385 1727204587.52233: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204587.52317: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204587.52338: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204587.52359: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204587.52444: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 
1727204587.54250: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204587.54254: stdout chunk (state=3): >>><<< 51385 1727204587.54257: stderr chunk (state=3): >>><<< 51385 1727204587.54368: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204587.54372: _low_level_execute_command(): starting 51385 1727204587.54375: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204587.4265947-51850-111613953924653/AnsiballZ_command.py && sleep 0' 51385 1727204587.55419: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204587.55423: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204587.55460: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 51385 1727204587.55469: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204587.55472: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204587.55700: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204587.55787: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204587.56062: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204587.69437: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo\nrpltstbr", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 15:03:07.690711", "end": "2024-09-24 15:03:07.693863", "delta": "0:00:00.003152", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 51385 1727204587.70657: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
<<< 51385 1727204587.70663: stdout chunk (state=3): >>><<< 51385 1727204587.70667: stderr chunk (state=3): >>><<< 51385 1727204587.70819: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo\nrpltstbr", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 15:03:07.690711", "end": "2024-09-24 15:03:07.693863", "delta": "0:00:00.003152", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
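The module result arrives as a single JSON document on stdout, which `_low_level_execute_command()` returns verbatim; higher layers then parse it into the task result printed later. A minimal sketch of that parsing step, exercised against the payload visible in this run (field names are those shown in the log; the `stdout_lines` derivation is my own reconstruction, not Ansible's code):

```python
import json

def parse_module_result(stdout: str) -> dict:
    """Parse the JSON blob an AnsiballZ module prints on stdout."""
    result = json.loads(stdout)
    # The command module reports output as one string; split it into
    # lines the way the stdout_lines field presents it.
    result["stdout_lines"] = result.get("stdout", "").splitlines()
    return result
```

With the result above, `rc` is 0 and the split stdout gives the four entries of `/sys/class/net`.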
51385 1727204587.70823: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204587.4265947-51850-111613953924653/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 51385 1727204587.70830: _low_level_execute_command(): starting 51385 1727204587.70832: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204587.4265947-51850-111613953924653/ > /dev/null 2>&1 && sleep 0' 51385 1727204587.72026: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204587.72030: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204587.72061: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 51385 1727204587.72066: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204587.72069: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204587.72145: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204587.72148: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204587.72151: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204587.72212: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204587.74103: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204587.74138: stderr chunk (state=3): >>><<< 51385 1727204587.74141: stdout chunk (state=3): >>><<< 51385 1727204587.74374: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204587.74378: handler run complete 51385 1727204587.74380: Evaluated conditional (False): False 51385 1727204587.74384: attempt loop complete, returning result 51385 1727204587.74387: _execute() done 51385 1727204587.74389: dumping result to json 51385 1727204587.74390: done dumping result, returning 51385 1727204587.74392: done running TaskExecutor() for managed-node1/TASK: Gather current interface info [0affcd87-79f5-6b1f-5706-00000000034b] 51385 1727204587.74394: sending task result for task 0affcd87-79f5-6b1f-5706-00000000034b 51385 1727204587.74472: done sending task result for task 0affcd87-79f5-6b1f-5706-00000000034b 51385 1727204587.74477: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003152", "end": "2024-09-24 15:03:07.693863", "rc": 0, "start": "2024-09-24 15:03:07.690711" } STDOUT: bonding_masters eth0 lo rpltstbr 51385 1727204587.74560: no more pending results, returning what we have 51385 1727204587.74565: results queue empty 51385 1727204587.74566: checking for any_errors_fatal 51385 1727204587.74568: done checking for any_errors_fatal 51385 1727204587.74568: checking for max_fail_percentage 51385 1727204587.74570: done checking for max_fail_percentage 51385 1727204587.74571: checking to see if all hosts have failed and the running result is not ok 51385 1727204587.74572: done checking to see if all hosts have failed 51385 1727204587.74573: getting the remaining hosts for this loop 51385 1727204587.74575: done getting the remaining hosts for this loop 51385 1727204587.74578: getting the next task for host managed-node1 51385 1727204587.74587: done getting next task for host managed-node1 51385 1727204587.74590: ^ task is: TASK: Set current_interfaces 51385 1727204587.74595: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51385 1727204587.74599: getting variables 51385 1727204587.74601: in VariableManager get_vars() 51385 1727204587.74703: Calling all_inventory to load vars for managed-node1 51385 1727204587.74706: Calling groups_inventory to load vars for managed-node1 51385 1727204587.74708: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204587.74719: Calling all_plugins_play to load vars for managed-node1 51385 1727204587.74722: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204587.74725: Calling groups_plugins_play to load vars for managed-node1 51385 1727204587.75093: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204587.75329: done with get_vars() 51385 1727204587.75341: done getting variables 51385 1727204587.75412: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Tuesday 24 September 2024 15:03:07 -0400 (0:00:00.380) 0:00:06.158 ***** 51385 1727204587.75446: entering _queue_task() for managed-node1/set_fact 51385 1727204587.75748: worker is 1 (out of 1 available) 51385 1727204587.75762: exiting _queue_task() for managed-node1/set_fact 51385 1727204587.75775: done queuing things up, now waiting for results queue to drain 51385 1727204587.75777: waiting for pending results... 
51385 1727204587.76059: running TaskExecutor() for managed-node1/TASK: Set current_interfaces 51385 1727204587.76190: in run() - task 0affcd87-79f5-6b1f-5706-00000000034c 51385 1727204587.76211: variable 'ansible_search_path' from source: unknown 51385 1727204587.76224: variable 'ansible_search_path' from source: unknown 51385 1727204587.76268: calling self._execute() 51385 1727204587.76363: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204587.76377: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204587.76391: variable 'omit' from source: magic vars 51385 1727204587.77060: variable 'ansible_distribution_major_version' from source: facts 51385 1727204587.77082: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204587.77097: variable 'omit' from source: magic vars 51385 1727204587.77157: variable 'omit' from source: magic vars 51385 1727204587.77478: variable '_current_interfaces' from source: set_fact 51385 1727204587.77555: variable 'omit' from source: magic vars 51385 1727204587.77718: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 51385 1727204587.77762: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 51385 1727204587.77837: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 51385 1727204587.77863: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 51385 1727204587.77937: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 51385 1727204587.77978: variable 'inventory_hostname' from source: host vars for 'managed-node1' 51385 1727204587.78043: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204587.78051: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204587.78184: Set connection var ansible_pipelining to False 51385 1727204587.78198: Set connection var ansible_shell_type to sh 51385 1727204587.78221: Set connection var ansible_module_compression to ZIP_DEFLATED 51385 1727204587.78235: Set connection var ansible_timeout to 10 51385 1727204587.78241: Set connection var ansible_connection to ssh 51385 1727204587.78257: Set connection var ansible_shell_executable to /bin/sh 51385 1727204587.78299: variable 'ansible_shell_executable' from source: unknown 51385 1727204587.78307: variable 'ansible_connection' from source: unknown 51385 1727204587.78313: variable 'ansible_module_compression' from source: unknown 51385 1727204587.78319: variable 'ansible_shell_type' from source: unknown 51385 1727204587.78325: variable 'ansible_shell_executable' from source: unknown 51385 1727204587.78331: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204587.78339: variable 'ansible_pipelining' from source: unknown 51385 1727204587.78345: variable 'ansible_timeout' from source: unknown 51385 1727204587.78352: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204587.78515: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 51385 1727204587.78533: variable 'omit' from source: magic vars 51385 1727204587.78543: starting attempt loop 51385 1727204587.78550: running the handler 51385 1727204587.78568: handler run complete 51385 1727204587.78588: attempt loop complete, returning result 51385 1727204587.78595: _execute() done 51385 1727204587.78605: dumping result to json 51385 1727204587.78613: done dumping result, returning 51385 
1727204587.78624: done running TaskExecutor() for managed-node1/TASK: Set current_interfaces [0affcd87-79f5-6b1f-5706-00000000034c] 51385 1727204587.78634: sending task result for task 0affcd87-79f5-6b1f-5706-00000000034c ok: [managed-node1] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo", "rpltstbr" ] }, "changed": false } 51385 1727204587.78793: no more pending results, returning what we have 51385 1727204587.78797: results queue empty 51385 1727204587.78798: checking for any_errors_fatal 51385 1727204587.78805: done checking for any_errors_fatal 51385 1727204587.78806: checking for max_fail_percentage 51385 1727204587.78808: done checking for max_fail_percentage 51385 1727204587.78809: checking to see if all hosts have failed and the running result is not ok 51385 1727204587.78810: done checking to see if all hosts have failed 51385 1727204587.78811: getting the remaining hosts for this loop 51385 1727204587.78813: done getting the remaining hosts for this loop 51385 1727204587.78817: getting the next task for host managed-node1 51385 1727204587.78826: done getting next task for host managed-node1 51385 1727204587.78829: ^ task is: TASK: Show current_interfaces 51385 1727204587.78833: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 51385 1727204587.78841: getting variables 51385 1727204587.78843: in VariableManager get_vars() 51385 1727204587.78893: Calling all_inventory to load vars for managed-node1 51385 1727204587.78896: Calling groups_inventory to load vars for managed-node1 51385 1727204587.78899: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204587.78911: Calling all_plugins_play to load vars for managed-node1 51385 1727204587.78914: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204587.78917: Calling groups_plugins_play to load vars for managed-node1 51385 1727204587.79128: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204587.79363: done with get_vars() 51385 1727204587.79426: done getting variables 51385 1727204587.79526: done sending task result for task 0affcd87-79f5-6b1f-5706-00000000034c 51385 1727204587.79529: WORKER PROCESS EXITING 51385 1727204587.79572: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Tuesday 24 September 2024 15:03:07 -0400 (0:00:00.041) 0:00:06.200 ***** 51385 1727204587.79609: entering _queue_task() for managed-node1/debug 51385 1727204587.80139: worker is 1 (out of 1 available) 51385 1727204587.80152: exiting _queue_task() for managed-node1/debug 51385 1727204587.80163: done queuing things up, now waiting for results queue to drain 51385 1727204587.80166: waiting for pending 
results... 51385 1727204587.80429: running TaskExecutor() for managed-node1/TASK: Show current_interfaces 51385 1727204587.80546: in run() - task 0affcd87-79f5-6b1f-5706-000000000315 51385 1727204587.80566: variable 'ansible_search_path' from source: unknown 51385 1727204587.80575: variable 'ansible_search_path' from source: unknown 51385 1727204587.80648: calling self._execute() 51385 1727204587.80992: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204587.81055: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204587.81072: variable 'omit' from source: magic vars 51385 1727204587.81762: variable 'ansible_distribution_major_version' from source: facts 51385 1727204587.81833: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204587.81877: variable 'omit' from source: magic vars 51385 1727204587.82041: variable 'omit' from source: magic vars 51385 1727204587.82162: variable 'current_interfaces' from source: set_fact 51385 1727204587.82207: variable 'omit' from source: magic vars 51385 1727204587.82269: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 51385 1727204587.82310: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 51385 1727204587.82337: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 51385 1727204587.82368: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 51385 1727204587.82388: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 51385 1727204587.82421: variable 'inventory_hostname' from source: host vars for 'managed-node1' 51385 1727204587.82429: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204587.82435: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204587.82543: Set connection var ansible_pipelining to False 51385 1727204587.82552: Set connection var ansible_shell_type to sh 51385 1727204587.82578: Set connection var ansible_module_compression to ZIP_DEFLATED 51385 1727204587.82592: Set connection var ansible_timeout to 10 51385 1727204587.82600: Set connection var ansible_connection to ssh 51385 1727204587.82611: Set connection var ansible_shell_executable to /bin/sh 51385 1727204587.82638: variable 'ansible_shell_executable' from source: unknown 51385 1727204587.82646: variable 'ansible_connection' from source: unknown 51385 1727204587.82652: variable 'ansible_module_compression' from source: unknown 51385 1727204587.82659: variable 'ansible_shell_type' from source: unknown 51385 1727204587.82671: variable 'ansible_shell_executable' from source: unknown 51385 1727204587.82679: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204587.82691: variable 'ansible_pipelining' from source: unknown 51385 1727204587.82699: variable 'ansible_timeout' from source: unknown 51385 1727204587.82707: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204587.82853: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 51385 1727204587.82872: variable 'omit' from source: magic vars 51385 1727204587.82880: starting attempt loop 51385 1727204587.82890: running the handler 51385 1727204587.82942: handler run complete 51385 1727204587.82961: attempt loop complete, returning result 51385 1727204587.82970: _execute() done 51385 1727204587.82977: dumping result to json 51385 1727204587.82984: done dumping result, returning 51385 1727204587.82999: done 
running TaskExecutor() for managed-node1/TASK: Show current_interfaces [0affcd87-79f5-6b1f-5706-000000000315] 51385 1727204587.83016: sending task result for task 0affcd87-79f5-6b1f-5706-000000000315 ok: [managed-node1] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo', 'rpltstbr'] 51385 1727204587.83230: no more pending results, returning what we have 51385 1727204587.83234: results queue empty 51385 1727204587.83235: checking for any_errors_fatal 51385 1727204587.83241: done checking for any_errors_fatal 51385 1727204587.83242: checking for max_fail_percentage 51385 1727204587.83244: done checking for max_fail_percentage 51385 1727204587.83245: checking to see if all hosts have failed and the running result is not ok 51385 1727204587.83246: done checking to see if all hosts have failed 51385 1727204587.83246: getting the remaining hosts for this loop 51385 1727204587.83248: done getting the remaining hosts for this loop 51385 1727204587.83252: getting the next task for host managed-node1 51385 1727204587.83260: done getting next task for host managed-node1 51385 1727204587.83268: ^ task is: TASK: Install iproute 51385 1727204587.83272: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51385 1727204587.83276: getting variables 51385 1727204587.83278: in VariableManager get_vars() 51385 1727204587.83313: Calling all_inventory to load vars for managed-node1 51385 1727204587.83316: Calling groups_inventory to load vars for managed-node1 51385 1727204587.83319: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204587.83330: Calling all_plugins_play to load vars for managed-node1 51385 1727204587.83333: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204587.83335: Calling groups_plugins_play to load vars for managed-node1 51385 1727204587.83522: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204587.83753: done with get_vars() 51385 1727204587.83766: done getting variables 51385 1727204587.83930: done sending task result for task 0affcd87-79f5-6b1f-5706-000000000315 51385 1727204587.83933: WORKER PROCESS EXITING 51385 1727204587.83952: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install iproute] ********************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 Tuesday 24 September 2024 15:03:07 -0400 (0:00:00.043) 0:00:06.243 ***** 51385 1727204587.83990: entering _queue_task() for managed-node1/package 51385 1727204587.84407: worker is 1 (out of 1 available) 51385 1727204587.84423: exiting _queue_task() for managed-node1/package 51385 1727204587.84433: done queuing things up, now waiting for results queue to drain 51385 1727204587.84434: waiting for pending results... 
51385 1727204587.84696: running TaskExecutor() for managed-node1/TASK: Install iproute 51385 1727204587.84802: in run() - task 0affcd87-79f5-6b1f-5706-00000000021e 51385 1727204587.84823: variable 'ansible_search_path' from source: unknown 51385 1727204587.84831: variable 'ansible_search_path' from source: unknown 51385 1727204587.84876: calling self._execute() 51385 1727204587.84963: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204587.84977: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204587.84996: variable 'omit' from source: magic vars 51385 1727204587.85388: variable 'ansible_distribution_major_version' from source: facts 51385 1727204587.85405: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204587.85414: variable 'omit' from source: magic vars 51385 1727204587.85462: variable 'omit' from source: magic vars 51385 1727204587.85677: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 51385 1727204587.88275: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 51385 1727204587.88344: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 51385 1727204587.88393: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 51385 1727204587.88431: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 51385 1727204587.88465: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 51385 1727204587.88572: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 51385 1727204587.88614: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 51385 1727204587.88645: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 51385 1727204587.88700: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 51385 1727204587.88722: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 51385 1727204587.88840: variable '__network_is_ostree' from source: set_fact 51385 1727204587.88851: variable 'omit' from source: magic vars 51385 1727204587.88888: variable 'omit' from source: magic vars 51385 1727204587.88928: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 51385 1727204587.88959: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 51385 1727204587.88985: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 51385 1727204587.89008: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 51385 1727204587.89029: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 51385 1727204587.89068: variable 'inventory_hostname' from source: host vars for 'managed-node1' 51385 1727204587.89077: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204587.89084: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed-node1' 51385 1727204587.89197: Set connection var ansible_pipelining to False 51385 1727204587.89206: Set connection var ansible_shell_type to sh 51385 1727204587.89222: Set connection var ansible_module_compression to ZIP_DEFLATED 51385 1727204587.89241: Set connection var ansible_timeout to 10 51385 1727204587.89253: Set connection var ansible_connection to ssh 51385 1727204587.89263: Set connection var ansible_shell_executable to /bin/sh 51385 1727204587.89292: variable 'ansible_shell_executable' from source: unknown 51385 1727204587.89300: variable 'ansible_connection' from source: unknown 51385 1727204587.89308: variable 'ansible_module_compression' from source: unknown 51385 1727204587.89314: variable 'ansible_shell_type' from source: unknown 51385 1727204587.89321: variable 'ansible_shell_executable' from source: unknown 51385 1727204587.89328: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204587.89338: variable 'ansible_pipelining' from source: unknown 51385 1727204587.89347: variable 'ansible_timeout' from source: unknown 51385 1727204587.89357: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204587.89466: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 51385 1727204587.89486: variable 'omit' from source: magic vars 51385 1727204587.89496: starting attempt loop 51385 1727204587.89502: running the handler 51385 1727204587.89513: variable 'ansible_facts' from source: unknown 51385 1727204587.89519: variable 'ansible_facts' from source: unknown 51385 1727204587.89558: _low_level_execute_command(): starting 51385 1727204587.89584: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 51385 
1727204587.90353: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204587.90373: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204587.90389: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204587.90409: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204587.90461: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204587.90480: stderr chunk (state=3): >>>debug2: match not found <<< 51385 1727204587.90495: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204587.90515: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 51385 1727204587.90528: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 51385 1727204587.90543: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 51385 1727204587.90555: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204587.90573: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204587.90594: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204587.90606: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204587.90618: stderr chunk (state=3): >>>debug2: match found <<< 51385 1727204587.90633: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204587.90719: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204587.90752: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204587.90777: stderr 
chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204587.90878: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204587.92530: stdout chunk (state=3): >>>/root <<< 51385 1727204587.92641: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204587.92737: stderr chunk (state=3): >>><<< 51385 1727204587.92749: stdout chunk (state=3): >>><<< 51385 1727204587.92882: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204587.92885: _low_level_execute_command(): starting 51385 1727204587.92888: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204587.9278767-51887-215517092316688 `" && echo 
ansible-tmp-1727204587.9278767-51887-215517092316688="` echo /root/.ansible/tmp/ansible-tmp-1727204587.9278767-51887-215517092316688 `" ) && sleep 0' 51385 1727204587.93525: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204587.93545: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204587.93562: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204587.93583: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204587.93625: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204587.93644: stderr chunk (state=3): >>>debug2: match not found <<< 51385 1727204587.93658: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204587.93678: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 51385 1727204587.93689: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 51385 1727204587.93700: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 51385 1727204587.93711: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204587.93723: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204587.93738: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204587.93755: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204587.93768: stderr chunk (state=3): >>>debug2: match found <<< 51385 1727204587.93783: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204587.93861: stderr chunk (state=3): >>>debug1: 
auto-mux: Trying existing master <<< 51385 1727204587.93889: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204587.93904: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204587.94004: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204587.95887: stdout chunk (state=3): >>>ansible-tmp-1727204587.9278767-51887-215517092316688=/root/.ansible/tmp/ansible-tmp-1727204587.9278767-51887-215517092316688 <<< 51385 1727204587.95999: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204587.96092: stderr chunk (state=3): >>><<< 51385 1727204587.96098: stdout chunk (state=3): >>><<< 51385 1727204587.96125: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204587.9278767-51887-215517092316688=/root/.ansible/tmp/ansible-tmp-1727204587.9278767-51887-215517092316688 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204587.96158: variable 'ansible_module_compression' from source: unknown 51385 1727204587.96230: ANSIBALLZ: Using generic lock for ansible.legacy.dnf 51385 1727204587.96234: ANSIBALLZ: Acquiring lock 51385 1727204587.96237: ANSIBALLZ: Lock acquired: 140124837667440 51385 1727204587.96239: ANSIBALLZ: Creating module 51385 1727204588.13777: ANSIBALLZ: Writing module into payload 51385 1727204588.14755: ANSIBALLZ: Writing module 51385 1727204588.14794: ANSIBALLZ: Renaming module 51385 1727204588.14810: ANSIBALLZ: Done creating module 51385 1727204588.14832: variable 'ansible_facts' from source: unknown 51385 1727204588.14928: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204587.9278767-51887-215517092316688/AnsiballZ_dnf.py 51385 1727204588.15090: Sending initial data 51385 1727204588.15093: Sent initial data (152 bytes) 51385 1727204588.16078: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204588.16094: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204588.16110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204588.16131: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204588.16181: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204588.16198: stderr chunk (state=3): >>>debug2: match not found <<< 51385 1727204588.16214: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204588.16232: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 51385 1727204588.16244: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 51385 1727204588.16255: stderr chunk 
(state=3): >>>debug1: re-parsing configuration <<< 51385 1727204588.16280: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204588.16294: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204588.16310: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204588.16325: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204588.16339: stderr chunk (state=3): >>>debug2: match found <<< 51385 1727204588.16354: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204588.16436: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204588.16462: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204588.16482: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204588.16576: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204588.18299: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 51385 1727204588.18347: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 51385 1727204588.18399: stdout chunk (state=3): 
>>>sftp> put /root/.ansible/tmp/ansible-local-51385tpxlmlox/tmpdkd192wn /root/.ansible/tmp/ansible-tmp-1727204587.9278767-51887-215517092316688/AnsiballZ_dnf.py <<< 51385 1727204588.18448: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 51385 1727204588.20384: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204588.20578: stderr chunk (state=3): >>><<< 51385 1727204588.20582: stdout chunk (state=3): >>><<< 51385 1727204588.20584: done transferring module to remote 51385 1727204588.20587: _low_level_execute_command(): starting 51385 1727204588.20589: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204587.9278767-51887-215517092316688/ /root/.ansible/tmp/ansible-tmp-1727204587.9278767-51887-215517092316688/AnsiballZ_dnf.py && sleep 0' 51385 1727204588.22273: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204588.22284: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204588.22294: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204588.22316: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204588.22362: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204588.22378: stderr chunk (state=3): >>>debug2: match not found <<< 51385 1727204588.22439: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204588.22459: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 51385 1727204588.22475: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 51385 1727204588.22487: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 51385 1727204588.22500: stderr 
chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204588.22513: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204588.22532: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204588.22547: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204588.22558: stderr chunk (state=3): >>>debug2: match found <<< 51385 1727204588.22575: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204588.22655: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204588.22781: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204588.22796: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204588.22992: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204588.24779: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204588.24782: stdout chunk (state=3): >>><<< 51385 1727204588.24785: stderr chunk (state=3): >>><<< 51385 1727204588.24883: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204588.24887: _low_level_execute_command(): starting 51385 1727204588.24889: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204587.9278767-51887-215517092316688/AnsiballZ_dnf.py && sleep 0' 51385 1727204588.26362: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204588.26368: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204588.26405: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 51385 1727204588.26409: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204588.26412: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204588.26471: stderr chunk (state=3): >>>debug1: 
auto-mux: Trying existing master <<< 51385 1727204588.26587: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204588.26590: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204588.26857: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204589.18111: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 51385 1727204589.22393: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
<<< 51385 1727204589.22398: stdout chunk (state=3): >>><<< 51385 1727204589.22400: stderr chunk (state=3): >>><<< 51385 1727204589.22469: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 
debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 51385 1727204589.22574: done with _execute_module (ansible.legacy.dnf, {'name': 'iproute', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204587.9278767-51887-215517092316688/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 51385 1727204589.22577: _low_level_execute_command(): starting 51385 1727204589.22580: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204587.9278767-51887-215517092316688/ > /dev/null 2>&1 && sleep 0' 51385 1727204589.23478: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204589.23482: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204589.23515: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 51385 1727204589.23519: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config <<< 51385 1727204589.23521: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found <<< 51385 1727204589.23523: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204589.23581: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204589.23593: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204589.23672: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204589.25508: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204589.25587: stderr chunk (state=3): >>><<< 51385 1727204589.25591: stdout chunk (state=3): >>><<< 51385 1727204589.25674: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: 
Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204589.25678: handler run complete 51385 1727204589.25810: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 51385 1727204589.26013: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 51385 1727204589.26070: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 51385 1727204589.26107: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 51385 1727204589.26148: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 51385 1727204589.26230: variable '__install_status' from source: unknown 51385 1727204589.26268: Evaluated conditional (__install_status is success): True 51385 1727204589.26295: attempt loop complete, returning result 51385 1727204589.26303: _execute() done 51385 1727204589.26309: dumping result to json 51385 1727204589.26319: done dumping result, returning 51385 1727204589.26331: done running TaskExecutor() for managed-node1/TASK: Install iproute [0affcd87-79f5-6b1f-5706-00000000021e] 51385 1727204589.26341: sending task result for task 0affcd87-79f5-6b1f-5706-00000000021e ok: [managed-node1] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 51385 1727204589.26569: no more pending results, returning what we have 51385 1727204589.26573: results queue empty 51385 1727204589.26576: checking for any_errors_fatal 51385 1727204589.26581: done checking for any_errors_fatal 51385 1727204589.26582: checking for max_fail_percentage 51385 1727204589.26583: done checking for max_fail_percentage 51385 1727204589.26585: checking to see if all hosts have failed and the running 
result is not ok 51385 1727204589.26586: done checking to see if all hosts have failed 51385 1727204589.26586: getting the remaining hosts for this loop 51385 1727204589.26588: done getting the remaining hosts for this loop 51385 1727204589.26592: getting the next task for host managed-node1 51385 1727204589.26600: done getting next task for host managed-node1 51385 1727204589.26602: ^ task is: TASK: Create veth interface {{ interface }} 51385 1727204589.26605: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51385 1727204589.26609: getting variables 51385 1727204589.26611: in VariableManager get_vars() 51385 1727204589.26655: Calling all_inventory to load vars for managed-node1 51385 1727204589.26658: Calling groups_inventory to load vars for managed-node1 51385 1727204589.26665: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204589.26677: Calling all_plugins_play to load vars for managed-node1 51385 1727204589.26679: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204589.26682: Calling groups_plugins_play to load vars for managed-node1 51385 1727204589.26921: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204589.27159: done with get_vars() 51385 1727204589.27176: done getting variables 51385 1727204589.27322: done sending task result for task 0affcd87-79f5-6b1f-5706-00000000021e 51385 1727204589.27326: WORKER PROCESS EXITING 51385 1727204589.27366: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 51385 1727204589.27613: variable 'interface' from source: play vars TASK [Create veth interface lsr101] ******************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 Tuesday 24 September 2024 15:03:09 -0400 (0:00:01.437) 0:00:07.681 ***** 51385 1727204589.27778: entering _queue_task() for managed-node1/command 51385 1727204589.28265: worker is 1 (out of 1 available) 51385 1727204589.28279: exiting _queue_task() for managed-node1/command 51385 1727204589.28290: done queuing things up, now waiting for results queue to drain 51385 1727204589.28291: waiting for pending results... 
51385 1727204589.28790: running TaskExecutor() for managed-node1/TASK: Create veth interface lsr101 51385 1727204589.28903: in run() - task 0affcd87-79f5-6b1f-5706-00000000021f 51385 1727204589.28924: variable 'ansible_search_path' from source: unknown 51385 1727204589.28934: variable 'ansible_search_path' from source: unknown 51385 1727204589.29220: variable 'interface' from source: play vars 51385 1727204589.29316: variable 'interface' from source: play vars 51385 1727204589.29395: variable 'interface' from source: play vars 51385 1727204589.29598: Loaded config def from plugin (lookup/items) 51385 1727204589.29610: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py 51385 1727204589.29642: variable 'omit' from source: magic vars 51385 1727204589.29777: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204589.29791: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204589.29805: variable 'omit' from source: magic vars 51385 1727204589.30045: variable 'ansible_distribution_major_version' from source: facts 51385 1727204589.30057: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204589.30274: variable 'type' from source: play vars 51385 1727204589.30285: variable 'state' from source: include params 51385 1727204589.30297: variable 'interface' from source: play vars 51385 1727204589.30305: variable 'current_interfaces' from source: set_fact 51385 1727204589.30315: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 51385 1727204589.30325: variable 'omit' from source: magic vars 51385 1727204589.30365: variable 'omit' from source: magic vars 51385 1727204589.30420: variable 'item' from source: unknown 51385 1727204589.30497: variable 'item' from source: unknown 51385 1727204589.30524: variable 'omit' from source: magic vars 51385 1727204589.30559: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection 51385 1727204589.30596: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 51385 1727204589.30625: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 51385 1727204589.30645: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 51385 1727204589.30660: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 51385 1727204589.30698: variable 'inventory_hostname' from source: host vars for 'managed-node1' 51385 1727204589.30706: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204589.30713: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204589.30819: Set connection var ansible_pipelining to False 51385 1727204589.30827: Set connection var ansible_shell_type to sh 51385 1727204589.30848: Set connection var ansible_module_compression to ZIP_DEFLATED 51385 1727204589.30866: Set connection var ansible_timeout to 10 51385 1727204589.30875: Set connection var ansible_connection to ssh 51385 1727204589.30885: Set connection var ansible_shell_executable to /bin/sh 51385 1727204589.30907: variable 'ansible_shell_executable' from source: unknown 51385 1727204589.30915: variable 'ansible_connection' from source: unknown 51385 1727204589.30922: variable 'ansible_module_compression' from source: unknown 51385 1727204589.30929: variable 'ansible_shell_type' from source: unknown 51385 1727204589.30935: variable 'ansible_shell_executable' from source: unknown 51385 1727204589.30949: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204589.30956: variable 'ansible_pipelining' from source: unknown 51385 1727204589.30967: variable 'ansible_timeout' from 
source: unknown 51385 1727204589.30976: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204589.31116: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 51385 1727204589.31133: variable 'omit' from source: magic vars 51385 1727204589.31140: starting attempt loop 51385 1727204589.31145: running the handler 51385 1727204589.31171: _low_level_execute_command(): starting 51385 1727204589.31182: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 51385 1727204589.31990: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204589.32007: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204589.32023: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204589.32050: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204589.32100: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204589.32112: stderr chunk (state=3): >>>debug2: match not found <<< 51385 1727204589.32127: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204589.32153: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 51385 1727204589.32172: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 51385 1727204589.32185: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 51385 1727204589.32198: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204589.32213: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204589.32230: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204589.32243: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204589.32256: stderr chunk (state=3): >>>debug2: match found <<< 51385 1727204589.32281: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204589.32358: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204589.32386: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204589.32402: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204589.32497: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204589.34090: stdout chunk (state=3): >>>/root <<< 51385 1727204589.34289: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204589.34293: stdout chunk (state=3): >>><<< 51385 1727204589.34295: stderr chunk (state=3): >>><<< 51385 1727204589.34406: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204589.34417: _low_level_execute_command(): starting 51385 1727204589.34420: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204589.3431509-51978-19648518760897 `" && echo ansible-tmp-1727204589.3431509-51978-19648518760897="` echo /root/.ansible/tmp/ansible-tmp-1727204589.3431509-51978-19648518760897 `" ) && sleep 0' 51385 1727204589.35093: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204589.35255: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204589.35343: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204589.35368: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204589.35412: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204589.35423: stderr chunk (state=3): >>>debug2: match not found <<< 51385 1727204589.35436: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204589.35452: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 51385 1727204589.35468: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 51385 1727204589.35479: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 51385 
1727204589.35491: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204589.35503: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204589.35521: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204589.35558: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204589.35589: stderr chunk (state=3): >>>debug2: match found <<< 51385 1727204589.35605: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204589.35684: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204589.35706: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204589.35720: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204589.35874: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204589.37684: stdout chunk (state=3): >>>ansible-tmp-1727204589.3431509-51978-19648518760897=/root/.ansible/tmp/ansible-tmp-1727204589.3431509-51978-19648518760897 <<< 51385 1727204589.37862: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204589.37882: stdout chunk (state=3): >>><<< 51385 1727204589.37885: stderr chunk (state=3): >>><<< 51385 1727204589.38195: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204589.3431509-51978-19648518760897=/root/.ansible/tmp/ansible-tmp-1727204589.3431509-51978-19648518760897 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204589.38199: variable 'ansible_module_compression' from source: unknown 51385 1727204589.38202: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-51385tpxlmlox/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 51385 1727204589.38204: variable 'ansible_facts' from source: unknown 51385 1727204589.38206: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204589.3431509-51978-19648518760897/AnsiballZ_command.py 51385 1727204589.38406: Sending initial data 51385 1727204589.38415: Sent initial data (155 bytes) 51385 1727204589.40072: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204589.40091: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204589.40106: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204589.40125: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204589.40170: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204589.40188: stderr chunk (state=3): >>>debug2: 
match not found <<< 51385 1727204589.40203: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204589.40223: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 51385 1727204589.40229: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 51385 1727204589.40238: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 51385 1727204589.40248: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204589.40271: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204589.40276: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204589.40285: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204589.40288: stderr chunk (state=3): >>>debug2: match found <<< 51385 1727204589.40304: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204589.40393: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204589.40416: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204589.40432: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204589.40649: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204589.42279: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports 
extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 51385 1727204589.42329: stderr chunk (state=3): >>>debug1: Using server download size 261120 <<< 51385 1727204589.42350: stderr chunk (state=3): >>>debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 51385 1727204589.42405: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-51385tpxlmlox/tmpe2we3gu4 /root/.ansible/tmp/ansible-tmp-1727204589.3431509-51978-19648518760897/AnsiballZ_command.py <<< 51385 1727204589.42461: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 51385 1727204589.44594: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204589.44846: stderr chunk (state=3): >>><<< 51385 1727204589.44850: stdout chunk (state=3): >>><<< 51385 1727204589.44852: done transferring module to remote 51385 1727204589.44854: _low_level_execute_command(): starting 51385 1727204589.44857: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204589.3431509-51978-19648518760897/ /root/.ansible/tmp/ansible-tmp-1727204589.3431509-51978-19648518760897/AnsiballZ_command.py && sleep 0' 51385 1727204589.46829: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204589.46833: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204589.46869: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204589.47005: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration <<< 51385 1727204589.47008: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found <<< 51385 1727204589.47011: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204589.47088: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204589.47120: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204589.47123: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204589.47204: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204589.49041: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204589.49045: stdout chunk (state=3): >>><<< 51385 1727204589.49047: stderr chunk (state=3): >>><<< 51385 1727204589.49172: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204589.49176: _low_level_execute_command(): starting 51385 1727204589.49179: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204589.3431509-51978-19648518760897/AnsiballZ_command.py && sleep 0' 51385 1727204589.50799: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204589.50959: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204589.50983: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204589.51002: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204589.51045: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204589.51057: stderr chunk (state=3): >>>debug2: match not found <<< 51385 1727204589.51078: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204589.51107: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 51385 1727204589.51120: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 51385 1727204589.51132: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 51385 1727204589.51144: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204589.51209: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 51385 1727204589.51227: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204589.51241: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204589.51254: stderr chunk (state=3): >>>debug2: match found <<< 51385 1727204589.51275: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204589.51352: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204589.51375: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204589.51390: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204589.51487: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204589.65544: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "lsr101", "type", "veth", "peer", "name", "peerlsr101"], "start": "2024-09-24 15:03:09.644040", "end": "2024-09-24 15:03:09.654509", "delta": "0:00:00.010469", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add lsr101 type veth peer name peerlsr101", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 51385 1727204589.67454: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
<<< 51385 1727204589.67459: stdout chunk (state=3): >>><<< 51385 1727204589.67471: stderr chunk (state=3): >>><<< 51385 1727204589.67492: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "lsr101", "type", "veth", "peer", "name", "peerlsr101"], "start": "2024-09-24 15:03:09.644040", "end": "2024-09-24 15:03:09.654509", "delta": "0:00:00.010469", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add lsr101 type veth peer name peerlsr101", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
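The record above shows the end of one module round-trip: `AnsiballZ_command.py` prints a single JSON document on stdout, `_low_level_execute_command()` returns it verbatim, and the action plugin decodes it into the task result. A minimal sketch of that decoding step, using the exact payload from the log (the variable names here are illustrative, not Ansible's own internals):

```python
import json

# Verbatim shape of the AnsiballZ_command.py stdout seen in the log above.
raw = (
    '{"changed": true, "stdout": "", "stderr": "", "rc": 0, '
    '"cmd": ["ip", "link", "add", "lsr101", "type", "veth", "peer", "name", "peerlsr101"], '
    '"start": "2024-09-24 15:03:09.644040", "end": "2024-09-24 15:03:09.654509", '
    '"delta": "0:00:00.010469", "msg": ""}'
)

result = json.loads(raw)

# For the command module (absent a failed_when override), a nonzero rc
# marks the task failed; rc=0 here, so the veth pair was created.
failed = result["rc"] != 0
print(" ".join(result["cmd"]), "| failed:", failed)
```

This is why the log's `rc=0, stdout={...}` line is immediately followed by `handler run complete`: once the JSON parses cleanly with `rc: 0`, the task is reported `ok` and the remote tmp directory is removed.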
51385 1727204589.67535: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link add lsr101 type veth peer name peerlsr101', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204589.3431509-51978-19648518760897/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 51385 1727204589.67538: _low_level_execute_command(): starting 51385 1727204589.67545: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204589.3431509-51978-19648518760897/ > /dev/null 2>&1 && sleep 0' 51385 1727204589.68441: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204589.68489: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204589.68504: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204589.68523: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204589.68578: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204589.68591: stderr chunk (state=3): >>>debug2: match not found <<< 51385 1727204589.68606: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204589.68624: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 51385 1727204589.68637: 
stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 51385 1727204589.68651: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 51385 1727204589.68671: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204589.68686: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204589.68702: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204589.68715: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204589.68726: stderr chunk (state=3): >>>debug2: match found <<< 51385 1727204589.68741: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204589.68823: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204589.68841: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204589.68856: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204589.68972: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204589.72153: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204589.72241: stderr chunk (state=3): >>><<< 51385 1727204589.72245: stdout chunk (state=3): >>><<< 51385 1727204589.72374: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204589.72377: handler run complete 51385 1727204589.72380: Evaluated conditional (False): False 51385 1727204589.72382: attempt loop complete, returning result 51385 1727204589.72384: variable 'item' from source: unknown 51385 1727204589.72500: variable 'item' from source: unknown ok: [managed-node1] => (item=ip link add lsr101 type veth peer name peerlsr101) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "add", "lsr101", "type", "veth", "peer", "name", "peerlsr101" ], "delta": "0:00:00.010469", "end": "2024-09-24 15:03:09.654509", "item": "ip link add lsr101 type veth peer name peerlsr101", "rc": 0, "start": "2024-09-24 15:03:09.644040" } 51385 1727204589.72782: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204589.72785: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204589.72834: variable 'omit' from source: magic vars 51385 1727204589.72995: variable 'ansible_distribution_major_version' from source: facts 51385 1727204589.73013: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204589.73229: variable 'type' from source: play vars 51385 1727204589.73238: variable 'state' from source: include params 51385 1727204589.73247: variable 'interface' 
from source: play vars 51385 1727204589.73256: variable 'current_interfaces' from source: set_fact 51385 1727204589.73273: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 51385 1727204589.73284: variable 'omit' from source: magic vars 51385 1727204589.73304: variable 'omit' from source: magic vars 51385 1727204589.73358: variable 'item' from source: unknown 51385 1727204589.73437: variable 'item' from source: unknown 51385 1727204589.73462: variable 'omit' from source: magic vars 51385 1727204589.73499: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 51385 1727204589.73513: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 51385 1727204589.73523: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 51385 1727204589.73542: variable 'inventory_hostname' from source: host vars for 'managed-node1' 51385 1727204589.73554: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204589.73562: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204589.73653: Set connection var ansible_pipelining to False 51385 1727204589.73755: Set connection var ansible_shell_type to sh 51385 1727204589.73777: Set connection var ansible_module_compression to ZIP_DEFLATED 51385 1727204589.73790: Set connection var ansible_timeout to 10 51385 1727204589.73797: Set connection var ansible_connection to ssh 51385 1727204589.73810: Set connection var ansible_shell_executable to /bin/sh 51385 1727204589.73834: variable 'ansible_shell_executable' from source: unknown 51385 1727204589.73842: variable 'ansible_connection' from source: unknown 51385 1727204589.73849: variable 'ansible_module_compression' from source: 
unknown 51385 1727204589.73861: variable 'ansible_shell_type' from source: unknown 51385 1727204589.73870: variable 'ansible_shell_executable' from source: unknown 51385 1727204589.73882: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204589.73892: variable 'ansible_pipelining' from source: unknown 51385 1727204589.73899: variable 'ansible_timeout' from source: unknown 51385 1727204589.73909: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204589.74016: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 51385 1727204589.74036: variable 'omit' from source: magic vars 51385 1727204589.74046: starting attempt loop 51385 1727204589.74053: running the handler 51385 1727204589.74068: _low_level_execute_command(): starting 51385 1727204589.74077: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 51385 1727204589.74876: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204589.74897: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204589.74913: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204589.74932: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204589.74979: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204589.74996: stderr chunk (state=3): >>>debug2: match not found <<< 51385 1727204589.75015: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204589.75035: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass <<< 51385 1727204589.75048: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 51385 1727204589.75059: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 51385 1727204589.75075: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204589.75091: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204589.75114: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204589.75128: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204589.75140: stderr chunk (state=3): >>>debug2: match found <<< 51385 1727204589.75154: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204589.75242: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204589.75260: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204589.75281: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204589.75392: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204589.76933: stdout chunk (state=3): >>>/root <<< 51385 1727204589.77118: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204589.77122: stdout chunk (state=3): >>><<< 51385 1727204589.77124: stderr chunk (state=3): >>><<< 51385 1727204589.77229: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204589.77232: _low_level_execute_command(): starting 51385 1727204589.77237: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204589.771468-51978-42618115195660 `" && echo ansible-tmp-1727204589.771468-51978-42618115195660="` echo /root/.ansible/tmp/ansible-tmp-1727204589.771468-51978-42618115195660 `" ) && sleep 0' 51385 1727204589.77822: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204589.77826: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204589.77869: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 51385 1727204589.77874: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204589.77876: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204589.77974: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204589.77977: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204589.78049: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204589.79892: stdout chunk (state=3): >>>ansible-tmp-1727204589.771468-51978-42618115195660=/root/.ansible/tmp/ansible-tmp-1727204589.771468-51978-42618115195660 <<< 51385 1727204589.80003: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204589.80079: stderr chunk (state=3): >>><<< 51385 1727204589.80082: stdout chunk (state=3): >>><<< 51385 1727204589.80101: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204589.771468-51978-42618115195660=/root/.ansible/tmp/ansible-tmp-1727204589.771468-51978-42618115195660 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204589.80128: variable 'ansible_module_compression' from source: unknown 51385 1727204589.80171: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-51385tpxlmlox/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 51385 1727204589.80191: variable 'ansible_facts' from source: unknown 51385 1727204589.80258: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204589.771468-51978-42618115195660/AnsiballZ_command.py 51385 1727204589.80394: Sending initial data 51385 1727204589.80398: Sent initial data (154 bytes) 51385 1727204589.83171: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204589.83182: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204589.83192: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204589.83208: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204589.83252: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204589.83393: stderr chunk (state=3): >>>debug2: match not found <<< 51385 1727204589.83405: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204589.83421: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 51385 1727204589.83428: stderr 
chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 51385 1727204589.83455: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 51385 1727204589.83467: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204589.83500: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204589.83512: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204589.83523: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204589.83592: stderr chunk (state=3): >>>debug2: match found <<< 51385 1727204589.83596: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204589.83653: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204589.83667: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204589.83670: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204589.83955: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204589.85485: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 51385 1727204589.85533: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using 
server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 51385 1727204589.85588: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-51385tpxlmlox/tmpd_uynnhq /root/.ansible/tmp/ansible-tmp-1727204589.771468-51978-42618115195660/AnsiballZ_command.py <<< 51385 1727204589.85637: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 51385 1727204589.86871: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204589.87071: stderr chunk (state=3): >>><<< 51385 1727204589.87074: stdout chunk (state=3): >>><<< 51385 1727204589.87079: done transferring module to remote 51385 1727204589.87081: _low_level_execute_command(): starting 51385 1727204589.87084: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204589.771468-51978-42618115195660/ /root/.ansible/tmp/ansible-tmp-1727204589.771468-51978-42618115195660/AnsiballZ_command.py && sleep 0' 51385 1727204589.88396: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204589.88409: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204589.88421: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204589.88436: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204589.88480: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204589.88494: stderr chunk (state=3): >>>debug2: match not found <<< 51385 1727204589.88505: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204589.88519: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 51385 1727204589.88528: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address 
<<< 51385 1727204589.88536: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 51385 1727204589.88545: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204589.88555: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204589.88577: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204589.88587: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204589.88600: stderr chunk (state=3): >>>debug2: match found <<< 51385 1727204589.88610: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204589.88687: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204589.88711: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204589.88724: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204589.88815: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204589.90583: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204589.90652: stderr chunk (state=3): >>><<< 51385 1727204589.90656: stdout chunk (state=3): >>><<< 51385 1727204589.90772: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204589.90779: _low_level_execute_command(): starting 51385 1727204589.90782: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204589.771468-51978-42618115195660/AnsiballZ_command.py && sleep 0' 51385 1727204589.92147: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204589.92162: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204589.92239: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204589.92245: stderr chunk (state=3): >>>debug2: match not found <<< 51385 1727204589.92269: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204589.92312: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 51385 1727204589.92315: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 51385 1727204589.92343: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 51385 1727204589.92372: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204589.92405: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204589.92431: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204589.92446: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204589.92461: stderr chunk (state=3): >>>debug2: match found <<< 51385 1727204589.92501: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204589.92601: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204589.92627: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204589.92645: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204589.92808: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204590.06121: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerlsr101", "up"], "start": "2024-09-24 15:03:10.057142", "end": "2024-09-24 15:03:10.060536", "delta": "0:00:00.003394", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerlsr101 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 51385 1727204590.07289: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
<<< 51385 1727204590.07424: stderr chunk (state=3): >>><<< 51385 1727204590.07429: stdout chunk (state=3): >>><<< 51385 1727204590.07530: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerlsr101", "up"], "start": "2024-09-24 15:03:10.057142", "end": "2024-09-24 15:03:10.060536", "delta": "0:00:00.003394", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerlsr101 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
51385 1727204590.07535: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set peerlsr101 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204589.771468-51978-42618115195660/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 51385 1727204590.07538: _low_level_execute_command(): starting 51385 1727204590.07607: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204589.771468-51978-42618115195660/ > /dev/null 2>&1 && sleep 0' 51385 1727204590.08967: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204590.09003: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204590.09029: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204590.09048: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204590.09095: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204590.09109: stderr chunk (state=3): >>>debug2: match not found <<< 51385 1727204590.09132: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204590.09174: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 51385 1727204590.09187: stderr chunk (state=3): 
>>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 51385 1727204590.09228: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 51385 1727204590.09255: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204590.09292: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204590.09308: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204590.09321: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204590.09343: stderr chunk (state=3): >>>debug2: match found <<< 51385 1727204590.09379: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204590.09550: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204590.09619: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204590.09637: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204590.09755: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204590.11629: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204590.11732: stderr chunk (state=3): >>><<< 51385 1727204590.11735: stdout chunk (state=3): >>><<< 51385 1727204590.11814: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204590.11817: handler run complete 51385 1727204590.11978: Evaluated conditional (False): False 51385 1727204590.11982: attempt loop complete, returning result 51385 1727204590.11985: variable 'item' from source: unknown 51385 1727204590.11987: variable 'item' from source: unknown ok: [managed-node1] => (item=ip link set peerlsr101 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "peerlsr101", "up" ], "delta": "0:00:00.003394", "end": "2024-09-24 15:03:10.060536", "item": "ip link set peerlsr101 up", "rc": 0, "start": "2024-09-24 15:03:10.057142" } 51385 1727204590.12474: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204590.12479: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204590.12566: variable 'omit' from source: magic vars 51385 1727204590.12787: variable 'ansible_distribution_major_version' from source: facts 51385 1727204590.12797: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204590.13014: variable 'type' from source: play vars 51385 1727204590.13234: variable 'state' from source: include params 51385 1727204590.13254: variable 'interface' from source: play vars 51385 1727204590.13268: variable 'current_interfaces' from source: set_fact 51385 
1727204590.13285: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 51385 1727204590.13308: variable 'omit' from source: magic vars 51385 1727204590.13333: variable 'omit' from source: magic vars 51385 1727204590.13392: variable 'item' from source: unknown 51385 1727204590.13494: variable 'item' from source: unknown 51385 1727204590.13535: variable 'omit' from source: magic vars 51385 1727204590.13573: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 51385 1727204590.13594: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 51385 1727204590.13607: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 51385 1727204590.13646: variable 'inventory_hostname' from source: host vars for 'managed-node1' 51385 1727204590.13658: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204590.13672: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204590.13777: Set connection var ansible_pipelining to False 51385 1727204590.13788: Set connection var ansible_shell_type to sh 51385 1727204590.13807: Set connection var ansible_module_compression to ZIP_DEFLATED 51385 1727204590.13821: Set connection var ansible_timeout to 10 51385 1727204590.13828: Set connection var ansible_connection to ssh 51385 1727204590.13887: Set connection var ansible_shell_executable to /bin/sh 51385 1727204590.13918: variable 'ansible_shell_executable' from source: unknown 51385 1727204590.13930: variable 'ansible_connection' from source: unknown 51385 1727204590.13940: variable 'ansible_module_compression' from source: unknown 51385 1727204590.13951: variable 'ansible_shell_type' from source: unknown 51385 1727204590.13961: 
variable 'ansible_shell_executable' from source: unknown 51385 1727204590.13978: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204590.13989: variable 'ansible_pipelining' from source: unknown 51385 1727204590.14001: variable 'ansible_timeout' from source: unknown 51385 1727204590.14011: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204590.14126: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 51385 1727204590.14141: variable 'omit' from source: magic vars 51385 1727204590.14150: starting attempt loop 51385 1727204590.14157: running the handler 51385 1727204590.14176: _low_level_execute_command(): starting 51385 1727204590.14185: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 51385 1727204590.14992: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204590.15007: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204590.15022: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204590.15046: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204590.15399: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204590.15411: stderr chunk (state=3): >>>debug2: match not found <<< 51385 1727204590.15426: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204590.15444: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 51385 1727204590.15460: stderr chunk (state=3): >>>debug2: 
resolve_canonicalize: hostname 10.31.9.148 is address <<< 51385 1727204590.15490: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 51385 1727204590.15509: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204590.15525: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204590.15549: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204590.15568: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204590.15581: stderr chunk (state=3): >>>debug2: match found <<< 51385 1727204590.15603: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204590.15685: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204590.15715: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204590.15731: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204590.15821: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204590.17349: stdout chunk (state=3): >>>/root <<< 51385 1727204590.17462: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204590.17568: stderr chunk (state=3): >>><<< 51385 1727204590.17572: stdout chunk (state=3): >>><<< 51385 1727204590.17677: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204590.17680: _low_level_execute_command(): starting 51385 1727204590.17683: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204590.1759732-51978-28194254319929 `" && echo ansible-tmp-1727204590.1759732-51978-28194254319929="` echo /root/.ansible/tmp/ansible-tmp-1727204590.1759732-51978-28194254319929 `" ) && sleep 0' 51385 1727204590.18555: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204590.18576: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204590.18591: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204590.18609: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204590.18651: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204590.18663: stderr chunk (state=3): >>>debug2: match not found <<< 51385 1727204590.18681: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204590.18698: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass <<< 51385 1727204590.18709: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 51385 1727204590.18720: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 51385 1727204590.18732: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204590.18744: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204590.18758: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204590.18773: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204590.18784: stderr chunk (state=3): >>>debug2: match found <<< 51385 1727204590.18798: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204590.18877: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204590.18898: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204590.18913: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204590.18998: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204590.20866: stdout chunk (state=3): >>>ansible-tmp-1727204590.1759732-51978-28194254319929=/root/.ansible/tmp/ansible-tmp-1727204590.1759732-51978-28194254319929 <<< 51385 1727204590.21068: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204590.21072: stdout chunk (state=3): >>><<< 51385 1727204590.21074: stderr chunk (state=3): >>><<< 51385 1727204590.21315: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204590.1759732-51978-28194254319929=/root/.ansible/tmp/ansible-tmp-1727204590.1759732-51978-28194254319929 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204590.21318: variable 'ansible_module_compression' from source: unknown 51385 1727204590.21321: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-51385tpxlmlox/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 51385 1727204590.21323: variable 'ansible_facts' from source: unknown 51385 1727204590.21325: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204590.1759732-51978-28194254319929/AnsiballZ_command.py 51385 1727204590.21438: Sending initial data 51385 1727204590.21444: Sent initial data (155 bytes) 51385 1727204590.22613: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204590.22617: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204590.22648: 
stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 51385 1727204590.22652: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204590.22654: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found <<< 51385 1727204590.22656: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204590.22737: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204590.22752: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204590.22839: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204590.24547: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 51385 1727204590.24601: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: 
Server handle limit 1019; using 64 <<< 51385 1727204590.24654: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-51385tpxlmlox/tmpvd3f3jcx /root/.ansible/tmp/ansible-tmp-1727204590.1759732-51978-28194254319929/AnsiballZ_command.py <<< 51385 1727204590.24710: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 51385 1727204590.26141: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204590.26316: stderr chunk (state=3): >>><<< 51385 1727204590.26319: stdout chunk (state=3): >>><<< 51385 1727204590.26322: done transferring module to remote 51385 1727204590.26324: _low_level_execute_command(): starting 51385 1727204590.26327: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204590.1759732-51978-28194254319929/ /root/.ansible/tmp/ansible-tmp-1727204590.1759732-51978-28194254319929/AnsiballZ_command.py && sleep 0' 51385 1727204590.27507: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204590.27685: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204590.27699: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204590.27715: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204590.27763: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204590.27779: stderr chunk (state=3): >>>debug2: match not found <<< 51385 1727204590.27795: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204590.27813: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 51385 1727204590.27825: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 51385 1727204590.27835: 
stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 51385 1727204590.27845: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204590.27855: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204590.27874: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204590.27883: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204590.27891: stderr chunk (state=3): >>>debug2: match found <<< 51385 1727204590.27901: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204590.27982: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204590.28082: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204590.28096: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204590.28353: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204590.30203: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204590.30207: stdout chunk (state=3): >>><<< 51385 1727204590.30210: stderr chunk (state=3): >>><<< 51385 1727204590.30321: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204590.30325: _low_level_execute_command(): starting 51385 1727204590.30328: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204590.1759732-51978-28194254319929/AnsiballZ_command.py && sleep 0' 51385 1727204590.31769: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204590.31886: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204590.31902: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204590.31920: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204590.31967: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204590.31989: stderr chunk (state=3): >>>debug2: match not found <<< 51385 1727204590.32083: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204590.32103: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 51385 1727204590.32115: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 51385 1727204590.32126: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 51385 1727204590.32137: stderr chunk (state=3): >>>debug1: 
Reading configuration data /root/.ssh/config <<< 51385 1727204590.32149: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204590.32169: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204590.32175: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204590.32182: stderr chunk (state=3): >>>debug2: match found <<< 51385 1727204590.32191: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204590.32259: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204590.32321: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204590.32327: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204590.32532: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204590.46190: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "lsr101", "up"], "start": "2024-09-24 15:03:10.455299", "end": "2024-09-24 15:03:10.461250", "delta": "0:00:00.005951", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set lsr101 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 51385 1727204590.47366: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
<<< 51385 1727204590.47371: stdout chunk (state=3): >>><<< 51385 1727204590.47379: stderr chunk (state=3): >>><<< 51385 1727204590.47404: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "lsr101", "up"], "start": "2024-09-24 15:03:10.455299", "end": "2024-09-24 15:03:10.461250", "delta": "0:00:00.005951", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set lsr101 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
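The module invocation logged above (`ansible.legacy.command` with `_raw_params: "ip link set lsr101 up"`, looped over an `item` variable) corresponds to a task roughly of the following shape. This is a hedged reconstruction from the logged `module_args` only, not the actual source of the test playbook; the loop structure and any sibling items are assumptions:

```yaml
# Reconstructed sketch (assumption): a command task looped over
# link-manipulation commands, matching the logged invocation of
# ansible.legacy.command with _raw_params "ip link set lsr101 up".
- name: Create veth interface lsr101
  command: "{{ item }}"
  loop:
    - ip link set lsr101 up
```

Note that the raw module result reports `"changed": true`, while the displayed task result shows `"changed": false`; the intervening `Evaluated conditional (False): False` entry indicates a `changed_when` override in the task, which would also be part of the real task definition.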
51385 1727204590.47433: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set lsr101 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204590.1759732-51978-28194254319929/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 51385 1727204590.47437: _low_level_execute_command(): starting 51385 1727204590.47443: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204590.1759732-51978-28194254319929/ > /dev/null 2>&1 && sleep 0' 51385 1727204590.49185: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204590.49285: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204590.49295: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204590.49309: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204590.49351: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204590.49480: stderr chunk (state=3): >>>debug2: match not found <<< 51385 1727204590.49499: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204590.49512: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 51385 1727204590.49520: stderr chunk (state=3): 
>>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 51385 1727204590.49526: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 51385 1727204590.49534: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204590.49542: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204590.49556: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204590.49570: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204590.49573: stderr chunk (state=3): >>>debug2: match found <<< 51385 1727204590.49585: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204590.49655: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204590.49673: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204590.50088: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204590.50163: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204590.51987: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204590.52011: stderr chunk (state=3): >>><<< 51385 1727204590.52015: stdout chunk (state=3): >>><<< 51385 1727204590.52033: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204590.52036: handler run complete 51385 1727204590.52060: Evaluated conditional (False): False 51385 1727204590.52074: attempt loop complete, returning result 51385 1727204590.52093: variable 'item' from source: unknown 51385 1727204590.52183: variable 'item' from source: unknown
ok: [managed-node1] => (item=ip link set lsr101 up) => {
    "ansible_loop_var": "item",
    "changed": false,
    "cmd": [
        "ip",
        "link",
        "set",
        "lsr101",
        "up"
    ],
    "delta": "0:00:00.005951",
    "end": "2024-09-24 15:03:10.461250",
    "item": "ip link set lsr101 up",
    "rc": 0,
    "start": "2024-09-24 15:03:10.455299"
}
51385 1727204590.52308: dumping result to json 51385 1727204590.52311: done dumping result, returning 51385 1727204590.52313: done running TaskExecutor() for managed-node1/TASK: Create veth interface lsr101 [0affcd87-79f5-6b1f-5706-00000000021f] 51385 1727204590.52315: sending task result for task 0affcd87-79f5-6b1f-5706-00000000021f 51385 1727204590.52359: done sending task result for task 0affcd87-79f5-6b1f-5706-00000000021f 51385 1727204590.52362: WORKER PROCESS EXITING 51385 1727204590.52419: no more pending results, returning what we have 51385 1727204590.52422: results queue empty 51385 1727204590.52423: checking for any_errors_fatal 51385 1727204590.52428: done checking for any_errors_fatal 51385 1727204590.52428:
checking for max_fail_percentage 51385 1727204590.52430: done checking for max_fail_percentage 51385 1727204590.52431: checking to see if all hosts have failed and the running result is not ok 51385 1727204590.52432: done checking to see if all hosts have failed 51385 1727204590.52432: getting the remaining hosts for this loop 51385 1727204590.52434: done getting the remaining hosts for this loop 51385 1727204590.52437: getting the next task for host managed-node1 51385 1727204590.52444: done getting next task for host managed-node1 51385 1727204590.52446: ^ task is: TASK: Set up veth as managed by NetworkManager 51385 1727204590.52450: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51385 1727204590.52453: getting variables 51385 1727204590.52454: in VariableManager get_vars() 51385 1727204590.52495: Calling all_inventory to load vars for managed-node1 51385 1727204590.52498: Calling groups_inventory to load vars for managed-node1 51385 1727204590.52500: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204590.52510: Calling all_plugins_play to load vars for managed-node1 51385 1727204590.52513: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204590.52515: Calling groups_plugins_play to load vars for managed-node1 51385 1727204590.52709: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204590.53451: done with get_vars() 51385 1727204590.53462: done getting variables 51385 1727204590.53524: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Set up veth as managed by NetworkManager] ********************************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:35
Tuesday 24 September 2024 15:03:10 -0400 (0:00:01.258)       0:00:08.940 *****

51385 1727204590.53675: entering _queue_task() for managed-node1/command 51385 1727204590.54334: worker is 1 (out of 1 available) 51385 1727204590.54349: exiting _queue_task() for managed-node1/command 51385 1727204590.54361: done queuing things up, now waiting for results queue to drain 51385 1727204590.54363: waiting for pending results... 
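The task queued here is gated on `type == 'veth' and state == 'present'` (evaluated in the TaskExecutor trace that follows) and runs through the same `command` action plugin. Based only on the task name and the logged condition and variables (`interface` from play vars), it presumably looks something like this sketch; the exact `nmcli` invocation is an assumption, since the excerpt ends before this task's `module_args` are printed:

```yaml
# Hypothetical sketch of the queued task; the nmcli command line is
# inferred from the task name, not taken from the log.
- name: Set up veth as managed by NetworkManager
  command: nmcli d set {{ interface }} managed true
  when: type == 'veth' and state == 'present'
```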
51385 1727204590.55259: running TaskExecutor() for managed-node1/TASK: Set up veth as managed by NetworkManager 51385 1727204590.55637: in run() - task 0affcd87-79f5-6b1f-5706-000000000220 51385 1727204590.55660: variable 'ansible_search_path' from source: unknown 51385 1727204590.55670: variable 'ansible_search_path' from source: unknown 51385 1727204590.55711: calling self._execute() 51385 1727204590.56055: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204590.56076: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204590.56093: variable 'omit' from source: magic vars 51385 1727204590.57041: variable 'ansible_distribution_major_version' from source: facts 51385 1727204590.57069: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204590.57462: variable 'type' from source: play vars 51385 1727204590.57484: variable 'state' from source: include params 51385 1727204590.57494: Evaluated conditional (type == 'veth' and state == 'present'): True 51385 1727204590.57505: variable 'omit' from source: magic vars 51385 1727204590.57546: variable 'omit' from source: magic vars 51385 1727204590.57761: variable 'interface' from source: play vars 51385 1727204590.57815: variable 'omit' from source: magic vars 51385 1727204590.57938: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 51385 1727204590.57980: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 51385 1727204590.58047: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 51385 1727204590.58136: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 51385 1727204590.58154: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 
51385 1727204590.58254: variable 'inventory_hostname' from source: host vars for 'managed-node1' 51385 1727204590.58263: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204590.58273: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204590.58514: Set connection var ansible_pipelining to False 51385 1727204590.58658: Set connection var ansible_shell_type to sh 51385 1727204590.58683: Set connection var ansible_module_compression to ZIP_DEFLATED 51385 1727204590.58697: Set connection var ansible_timeout to 10 51385 1727204590.58704: Set connection var ansible_connection to ssh 51385 1727204590.58714: Set connection var ansible_shell_executable to /bin/sh 51385 1727204590.58742: variable 'ansible_shell_executable' from source: unknown 51385 1727204590.58874: variable 'ansible_connection' from source: unknown 51385 1727204590.58892: variable 'ansible_module_compression' from source: unknown 51385 1727204590.58900: variable 'ansible_shell_type' from source: unknown 51385 1727204590.58907: variable 'ansible_shell_executable' from source: unknown 51385 1727204590.58914: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204590.58921: variable 'ansible_pipelining' from source: unknown 51385 1727204590.58927: variable 'ansible_timeout' from source: unknown 51385 1727204590.58934: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204590.59183: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 51385 1727204590.59323: variable 'omit' from source: magic vars 51385 1727204590.59337: starting attempt loop 51385 1727204590.59344: running the handler 51385 1727204590.59365: _low_level_execute_command(): 
starting 51385 1727204590.59379: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 51385 1727204590.60773: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204590.60779: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204590.60802: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204590.60807: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204590.60885: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204590.60888: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204590.61061: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204590.61131: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204590.62675: stdout chunk (state=3): >>>/root <<< 51385 1727204590.62849: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204590.62853: stdout chunk (state=3): >>><<< 51385 1727204590.62876: stderr chunk (state=3): >>><<< 51385 1727204590.62907: _low_level_execute_command() done: rc=0, stdout=/root , 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204590.62925: _low_level_execute_command(): starting 51385 1727204590.62933: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204590.6290758-52210-64860600744915 `" && echo ansible-tmp-1727204590.6290758-52210-64860600744915="` echo /root/.ansible/tmp/ansible-tmp-1727204590.6290758-52210-64860600744915 `" ) && sleep 0' 51385 1727204590.64743: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204590.64758: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204590.64776: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204590.64821: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204590.64955: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204590.64974: stderr chunk (state=3): >>>debug2: match not found <<< 51385 1727204590.64994: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204590.65017: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 51385 1727204590.65168: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 51385 1727204590.65182: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 51385 1727204590.65193: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204590.65362: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204590.65387: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204590.65402: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204590.65415: stderr chunk (state=3): >>>debug2: match found <<< 51385 1727204590.65430: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204590.65617: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204590.65636: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204590.65651: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204590.65844: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204590.67626: stdout chunk (state=3): >>>ansible-tmp-1727204590.6290758-52210-64860600744915=/root/.ansible/tmp/ansible-tmp-1727204590.6290758-52210-64860600744915 <<< 51385 1727204590.67799: stderr 
chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204590.67803: stdout chunk (state=3): >>><<< 51385 1727204590.67805: stderr chunk (state=3): >>><<< 51385 1727204590.67977: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204590.6290758-52210-64860600744915=/root/.ansible/tmp/ansible-tmp-1727204590.6290758-52210-64860600744915 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204590.67981: variable 'ansible_module_compression' from source: unknown 51385 1727204590.67983: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-51385tpxlmlox/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 51385 1727204590.67985: variable 'ansible_facts' from source: unknown 51385 1727204590.68067: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204590.6290758-52210-64860600744915/AnsiballZ_command.py 51385 1727204590.68623: Sending 
initial data 51385 1727204590.68626: Sent initial data (155 bytes) 51385 1727204590.70858: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204590.70863: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204590.70897: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204590.70900: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204590.70962: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204590.70967: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204590.70970: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204590.71037: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204590.72781: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension 
"fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 51385 1727204590.72870: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 51385 1727204590.72952: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-51385tpxlmlox/tmp7q63kl8z /root/.ansible/tmp/ansible-tmp-1727204590.6290758-52210-64860600744915/AnsiballZ_command.py <<< 51385 1727204590.73822: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 51385 1727204590.74750: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204590.75122: stderr chunk (state=3): >>><<< 51385 1727204590.75152: stdout chunk (state=3): >>><<< 51385 1727204590.75221: done transferring module to remote 51385 1727204590.75250: _low_level_execute_command(): starting 51385 1727204590.75314: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204590.6290758-52210-64860600744915/ /root/.ansible/tmp/ansible-tmp-1727204590.6290758-52210-64860600744915/AnsiballZ_command.py && sleep 0' 51385 1727204590.76913: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204590.76929: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204590.76945: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204590.76970: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204590.77023: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204590.77035: stderr chunk (state=3): >>>debug2: match not found <<< 
51385 1727204590.77055: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204590.77079: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 51385 1727204590.77094: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 51385 1727204590.77109: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 51385 1727204590.77121: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204590.77133: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204590.77147: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204590.77158: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204590.77174: stderr chunk (state=3): >>>debug2: match found <<< 51385 1727204590.77188: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204590.77277: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204590.77308: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204590.77332: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204590.77424: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204590.79185: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204590.79547: stderr chunk (state=3): >>><<< 51385 1727204590.79568: stdout chunk (state=3): >>><<< 51385 1727204590.79642: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204590.79658: _low_level_execute_command(): starting 51385 1727204590.79666: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204590.6290758-52210-64860600744915/AnsiballZ_command.py && sleep 0' 51385 1727204590.81443: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204590.81459: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204590.81480: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204590.81503: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204590.81556: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204590.81572: stderr chunk (state=3): >>>debug2: match not found <<< 51385 1727204590.81586: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204590.81607: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 51385 1727204590.81622: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 51385 1727204590.81641: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 51385 1727204590.81654: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204590.81674: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204590.81690: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204590.81707: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204590.81719: stderr chunk (state=3): >>>debug2: match found <<< 51385 1727204590.81738: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204590.81843: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204590.81878: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204590.81894: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204590.81997: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204590.97042: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "lsr101", "managed", "true"], "start": "2024-09-24 15:03:10.949609", "end": "2024-09-24 15:03:10.969516", "delta": "0:00:00.019907", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set lsr101 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, 
"stdin": null}}}<<< 51385 1727204590.97047: stdout chunk (state=3): >>> <<< 51385 1727204590.98344: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. <<< 51385 1727204590.98347: stdout chunk (state=3): >>><<< 51385 1727204590.98350: stderr chunk (state=3): >>><<< 51385 1727204590.98501: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "lsr101", "managed", "true"], "start": "2024-09-24 15:03:10.949609", "end": "2024-09-24 15:03:10.969516", "delta": "0:00:00.019907", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set lsr101 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master 
session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 51385 1727204590.98506: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli d set lsr101 managed true', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204590.6290758-52210-64860600744915/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 51385 1727204590.98513: _low_level_execute_command(): starting 51385 1727204590.98516: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204590.6290758-52210-64860600744915/ > /dev/null 2>&1 && sleep 0' 51385 1727204591.00587: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204591.00681: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204591.00784: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204591.00799: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204591.00837: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204591.00845: stderr chunk (state=3): >>>debug2: match not found <<< 51385 1727204591.00855: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204591.00874: stderr chunk (state=3): 
>>>debug1: configuration requests final Match pass <<< 51385 1727204591.00881: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 51385 1727204591.00892: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 51385 1727204591.00904: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204591.00916: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204591.00929: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204591.00940: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204591.00949: stderr chunk (state=3): >>>debug2: match found <<< 51385 1727204591.00965: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204591.01041: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204591.01120: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204591.01138: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204591.01223: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204591.03081: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204591.03085: stdout chunk (state=3): >>><<< 51385 1727204591.03087: stderr chunk (state=3): >>><<< 51385 1727204591.03169: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204591.03173: handler run complete 51385 1727204591.03175: Evaluated conditional (False): False 51385 1727204591.03177: attempt loop complete, returning result 51385 1727204591.03179: _execute() done 51385 1727204591.03182: dumping result to json 51385 1727204591.03184: done dumping result, returning 51385 1727204591.03185: done running TaskExecutor() for managed-node1/TASK: Set up veth as managed by NetworkManager [0affcd87-79f5-6b1f-5706-000000000220] 51385 1727204591.03188: sending task result for task 0affcd87-79f5-6b1f-5706-000000000220 51385 1727204591.03346: done sending task result for task 0affcd87-79f5-6b1f-5706-000000000220 51385 1727204591.03350: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "cmd": [ "nmcli", "d", "set", "lsr101", "managed", "true" ], "delta": "0:00:00.019907", "end": "2024-09-24 15:03:10.969516", "rc": 0, "start": "2024-09-24 15:03:10.949609" } 51385 1727204591.03425: no more pending results, returning what we have 51385 1727204591.03428: results queue empty 51385 1727204591.03429: checking for any_errors_fatal 51385 1727204591.03439: done checking for any_errors_fatal 51385 1727204591.03440: checking for max_fail_percentage 51385 
1727204591.03441: done checking for max_fail_percentage 51385 1727204591.03442: checking to see if all hosts have failed and the running result is not ok 51385 1727204591.03443: done checking to see if all hosts have failed 51385 1727204591.03444: getting the remaining hosts for this loop 51385 1727204591.03445: done getting the remaining hosts for this loop 51385 1727204591.03449: getting the next task for host managed-node1 51385 1727204591.03455: done getting next task for host managed-node1 51385 1727204591.03457: ^ task is: TASK: Delete veth interface {{ interface }} 51385 1727204591.03461: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51385 1727204591.03467: getting variables 51385 1727204591.03469: in VariableManager get_vars() 51385 1727204591.03510: Calling all_inventory to load vars for managed-node1 51385 1727204591.03512: Calling groups_inventory to load vars for managed-node1 51385 1727204591.03515: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204591.03526: Calling all_plugins_play to load vars for managed-node1 51385 1727204591.03528: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204591.03531: Calling groups_plugins_play to load vars for managed-node1 51385 1727204591.03719: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204591.03954: done with get_vars() 51385 1727204591.03969: done getting variables 51385 1727204591.04148: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 51385 1727204591.04391: variable 'interface' from source: play vars TASK [Delete veth interface lsr101] ******************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:43 Tuesday 24 September 2024 15:03:11 -0400 (0:00:00.507) 0:00:09.449 ***** 51385 1727204591.04541: entering _queue_task() for managed-node1/command 51385 1727204591.05067: worker is 1 (out of 1 available) 51385 1727204591.05094: exiting _queue_task() for managed-node1/command 51385 1727204591.05106: done queuing things up, now waiting for results queue to drain 51385 1727204591.05107: waiting for pending results... 
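[Annotation] The records above show Ansible's standard low-level execution flow for a command task: create a remote temp directory, transfer `AnsiballZ_command.py` over SFTP, `chmod u+x` it, run it with the remote Python, then `rm -f -r` the temp directory. Each remote step is wrapped as `/bin/sh -c '<cmd> && sleep 0'`. A minimal local sketch of that wrapper pattern (illustrative only, not Ansible's actual code; the module body here is a stand-in):

```python
import os
import subprocess
import tempfile

# Stand-in for the transferred AnsiballZ module (assumption: real modules are
# self-extracting zipapps, not a one-line script).
tmpdir = tempfile.mkdtemp(prefix="ansible-tmp-")
module = os.path.join(tmpdir, "AnsiballZ_command.py")
with open(module, "w") as f:
    f.write("print('ok')\n")

def low_level_execute(cmd):
    # Mirrors the log: every remote command runs as /bin/sh -c '<cmd> && sleep 0'
    return subprocess.run(["/bin/sh", "-c", f"{cmd} && sleep 0"],
                          capture_output=True, text=True)

r_chmod = low_level_execute(f"chmod u+x {tmpdir} {module}")
r_exec = low_level_execute(f"python3 {module}")
r_rm = low_level_execute(f"rm -f -r {tmpdir} > /dev/null 2>&1")
print(r_exec.stdout.strip())
```

In the real flow the same three `_low_level_execute_command()` calls run over the multiplexed SSH connection (`auto-mux: Trying existing master` in the stderr chunks), which is why each step pays no new connection cost.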
51385 1727204591.06349: running TaskExecutor() for managed-node1/TASK: Delete veth interface lsr101 51385 1727204591.06468: in run() - task 0affcd87-79f5-6b1f-5706-000000000221 51385 1727204591.06602: variable 'ansible_search_path' from source: unknown 51385 1727204591.06612: variable 'ansible_search_path' from source: unknown 51385 1727204591.06653: calling self._execute() 51385 1727204591.06779: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204591.06893: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204591.06919: variable 'omit' from source: magic vars 51385 1727204591.07663: variable 'ansible_distribution_major_version' from source: facts 51385 1727204591.07797: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204591.08241: variable 'type' from source: play vars 51385 1727204591.08251: variable 'state' from source: include params 51385 1727204591.08258: variable 'interface' from source: play vars 51385 1727204591.08270: variable 'current_interfaces' from source: set_fact 51385 1727204591.08282: Evaluated conditional (type == 'veth' and state == 'absent' and interface in current_interfaces): False 51385 1727204591.08328: when evaluation is False, skipping this task 51385 1727204591.08336: _execute() done 51385 1727204591.08344: dumping result to json 51385 1727204591.08352: done dumping result, returning 51385 1727204591.08366: done running TaskExecutor() for managed-node1/TASK: Delete veth interface lsr101 [0affcd87-79f5-6b1f-5706-000000000221] 51385 1727204591.08377: sending task result for task 0affcd87-79f5-6b1f-5706-000000000221 51385 1727204591.08535: done sending task result for task 0affcd87-79f5-6b1f-5706-000000000221 51385 1727204591.08545: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "type == 'veth' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" 
} 51385 1727204591.08602: no more pending results, returning what we have 51385 1727204591.08607: results queue empty 51385 1727204591.08608: checking for any_errors_fatal 51385 1727204591.08615: done checking for any_errors_fatal 51385 1727204591.08616: checking for max_fail_percentage 51385 1727204591.08617: done checking for max_fail_percentage 51385 1727204591.08618: checking to see if all hosts have failed and the running result is not ok 51385 1727204591.08619: done checking to see if all hosts have failed 51385 1727204591.08620: getting the remaining hosts for this loop 51385 1727204591.08621: done getting the remaining hosts for this loop 51385 1727204591.08625: getting the next task for host managed-node1 51385 1727204591.08632: done getting next task for host managed-node1 51385 1727204591.08634: ^ task is: TASK: Create dummy interface {{ interface }} 51385 1727204591.08638: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51385 1727204591.08641: getting variables 51385 1727204591.08643: in VariableManager get_vars() 51385 1727204591.08693: Calling all_inventory to load vars for managed-node1 51385 1727204591.08696: Calling groups_inventory to load vars for managed-node1 51385 1727204591.08699: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204591.08713: Calling all_plugins_play to load vars for managed-node1 51385 1727204591.08716: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204591.08719: Calling groups_plugins_play to load vars for managed-node1 51385 1727204591.08914: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204591.09315: done with get_vars() 51385 1727204591.09327: done getting variables 51385 1727204591.09388: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 51385 1727204591.09619: variable 'interface' from source: play vars TASK [Create dummy interface lsr101] ******************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:49 Tuesday 24 September 2024 15:03:11 -0400 (0:00:00.052) 0:00:09.501 ***** 51385 1727204591.09767: entering _queue_task() for managed-node1/command 51385 1727204591.10267: worker is 1 (out of 1 available) 51385 1727204591.10279: exiting _queue_task() for managed-node1/command 51385 1727204591.10406: done queuing things up, now waiting for results queue to drain 51385 1727204591.10407: waiting for pending results... 
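[Annotation] The completed "Set up veth as managed by NetworkManager" task earlier in this log is worth noting: the raw module JSON reported `"changed": true`, but the final `ok:` result shows `"changed": false`, which typically indicates the task overrides changed reporting. A hypothetical reconstruction of that task (the exact YAML is an assumption; the command and task name are taken from the log, and `changed_when: false` is inferred from the changed-flag mismatch):

```yaml
- name: Set up veth as managed by NetworkManager
  command: nmcli d set {{ interface }} managed true
  changed_when: false
```

With `interface: lsr101` from play vars, this templates to the `nmcli d set lsr101 managed true` invocation seen in the module result.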
51385 1727204591.11196: running TaskExecutor() for managed-node1/TASK: Create dummy interface lsr101 51385 1727204591.11420: in run() - task 0affcd87-79f5-6b1f-5706-000000000222 51385 1727204591.11441: variable 'ansible_search_path' from source: unknown 51385 1727204591.11448: variable 'ansible_search_path' from source: unknown 51385 1727204591.11496: calling self._execute() 51385 1727204591.11705: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204591.11716: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204591.11729: variable 'omit' from source: magic vars 51385 1727204591.12490: variable 'ansible_distribution_major_version' from source: facts 51385 1727204591.12564: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204591.13051: variable 'type' from source: play vars 51385 1727204591.13079: variable 'state' from source: include params 51385 1727204591.13088: variable 'interface' from source: play vars 51385 1727204591.13110: variable 'current_interfaces' from source: set_fact 51385 1727204591.13146: Evaluated conditional (type == 'dummy' and state == 'present' and interface not in current_interfaces): False 51385 1727204591.13169: when evaluation is False, skipping this task 51385 1727204591.13975: _execute() done 51385 1727204591.13983: dumping result to json 51385 1727204591.13991: done dumping result, returning 51385 1727204591.14001: done running TaskExecutor() for managed-node1/TASK: Create dummy interface lsr101 [0affcd87-79f5-6b1f-5706-000000000222] 51385 1727204591.14012: sending task result for task 0affcd87-79f5-6b1f-5706-000000000222 skipping: [managed-node1] => { "changed": false, "false_condition": "type == 'dummy' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 51385 1727204591.14190: no more pending results, returning what we have 51385 1727204591.14194: results queue empty 51385 
1727204591.14195: checking for any_errors_fatal 51385 1727204591.14201: done checking for any_errors_fatal 51385 1727204591.14201: checking for max_fail_percentage 51385 1727204591.14203: done checking for max_fail_percentage 51385 1727204591.14204: checking to see if all hosts have failed and the running result is not ok 51385 1727204591.14205: done checking to see if all hosts have failed 51385 1727204591.14205: getting the remaining hosts for this loop 51385 1727204591.14207: done getting the remaining hosts for this loop 51385 1727204591.14210: getting the next task for host managed-node1 51385 1727204591.14217: done getting next task for host managed-node1 51385 1727204591.14219: ^ task is: TASK: Delete dummy interface {{ interface }} 51385 1727204591.14222: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51385 1727204591.14227: getting variables 51385 1727204591.14229: in VariableManager get_vars() 51385 1727204591.14272: Calling all_inventory to load vars for managed-node1 51385 1727204591.14274: Calling groups_inventory to load vars for managed-node1 51385 1727204591.14277: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204591.14290: Calling all_plugins_play to load vars for managed-node1 51385 1727204591.14292: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204591.14295: Calling groups_plugins_play to load vars for managed-node1 51385 1727204591.14476: done sending task result for task 0affcd87-79f5-6b1f-5706-000000000222 51385 1727204591.14480: WORKER PROCESS EXITING 51385 1727204591.14509: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204591.14726: done with get_vars() 51385 1727204591.14737: done getting variables 51385 1727204591.14922: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 51385 1727204591.15157: variable 'interface' from source: play vars TASK [Delete dummy interface lsr101] ******************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:54 Tuesday 24 September 2024 15:03:11 -0400 (0:00:00.054) 0:00:09.555 ***** 51385 1727204591.15191: entering _queue_task() for managed-node1/command 51385 1727204591.15793: worker is 1 (out of 1 available) 51385 1727204591.15807: exiting _queue_task() for managed-node1/command 51385 1727204591.15819: done queuing things up, now waiting for results queue to drain 51385 1727204591.15820: waiting for pending results... 
51385 1727204591.16617: running TaskExecutor() for managed-node1/TASK: Delete dummy interface lsr101 51385 1727204591.16725: in run() - task 0affcd87-79f5-6b1f-5706-000000000223 51385 1727204591.16916: variable 'ansible_search_path' from source: unknown 51385 1727204591.16924: variable 'ansible_search_path' from source: unknown 51385 1727204591.16969: calling self._execute() 51385 1727204591.17067: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204591.17079: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204591.17093: variable 'omit' from source: magic vars 51385 1727204591.17465: variable 'ansible_distribution_major_version' from source: facts 51385 1727204591.17486: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204591.17702: variable 'type' from source: play vars 51385 1727204591.17715: variable 'state' from source: include params 51385 1727204591.17724: variable 'interface' from source: play vars 51385 1727204591.17732: variable 'current_interfaces' from source: set_fact 51385 1727204591.17745: Evaluated conditional (type == 'dummy' and state == 'absent' and interface in current_interfaces): False 51385 1727204591.17752: when evaluation is False, skipping this task 51385 1727204591.17786: _execute() done 51385 1727204591.17794: dumping result to json 51385 1727204591.17804: done dumping result, returning 51385 1727204591.17814: done running TaskExecutor() for managed-node1/TASK: Delete dummy interface lsr101 [0affcd87-79f5-6b1f-5706-000000000223] 51385 1727204591.17824: sending task result for task 0affcd87-79f5-6b1f-5706-000000000223 skipping: [managed-node1] => { "changed": false, "false_condition": "type == 'dummy' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 51385 1727204591.17978: no more pending results, returning what we have 51385 1727204591.17982: results queue empty 51385 
1727204591.17983: checking for any_errors_fatal 51385 1727204591.17990: done checking for any_errors_fatal 51385 1727204591.17991: checking for max_fail_percentage 51385 1727204591.17992: done checking for max_fail_percentage 51385 1727204591.17993: checking to see if all hosts have failed and the running result is not ok 51385 1727204591.17994: done checking to see if all hosts have failed 51385 1727204591.17995: getting the remaining hosts for this loop 51385 1727204591.17996: done getting the remaining hosts for this loop 51385 1727204591.18000: getting the next task for host managed-node1 51385 1727204591.18006: done getting next task for host managed-node1 51385 1727204591.18009: ^ task is: TASK: Create tap interface {{ interface }} 51385 1727204591.18012: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51385 1727204591.18015: getting variables 51385 1727204591.18017: in VariableManager get_vars() 51385 1727204591.18059: Calling all_inventory to load vars for managed-node1 51385 1727204591.18062: Calling groups_inventory to load vars for managed-node1 51385 1727204591.18065: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204591.18073: done sending task result for task 0affcd87-79f5-6b1f-5706-000000000223 51385 1727204591.18076: WORKER PROCESS EXITING 51385 1727204591.18089: Calling all_plugins_play to load vars for managed-node1 51385 1727204591.18092: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204591.18096: Calling groups_plugins_play to load vars for managed-node1 51385 1727204591.18398: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204591.18741: done with get_vars() 51385 1727204591.18752: done getting variables 51385 1727204591.18858: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 51385 1727204591.18981: variable 'interface' from source: play vars TASK [Create tap interface lsr101] ********************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:60 Tuesday 24 September 2024 15:03:11 -0400 (0:00:00.038) 0:00:09.594 ***** 51385 1727204591.19012: entering _queue_task() for managed-node1/command 51385 1727204591.19285: worker is 1 (out of 1 available) 51385 1727204591.19297: exiting _queue_task() for managed-node1/command 51385 1727204591.19309: done queuing things up, now waiting for results queue to drain 51385 1727204591.19310: waiting for pending results... 
51385 1727204591.19577: running TaskExecutor() for managed-node1/TASK: Create tap interface lsr101 51385 1727204591.19693: in run() - task 0affcd87-79f5-6b1f-5706-000000000224 51385 1727204591.19716: variable 'ansible_search_path' from source: unknown 51385 1727204591.19724: variable 'ansible_search_path' from source: unknown 51385 1727204591.19771: calling self._execute() 51385 1727204591.19864: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204591.20530: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204591.20545: variable 'omit' from source: magic vars 51385 1727204591.21250: variable 'ansible_distribution_major_version' from source: facts 51385 1727204591.21336: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204591.21574: variable 'type' from source: play vars 51385 1727204591.21588: variable 'state' from source: include params 51385 1727204591.21596: variable 'interface' from source: play vars 51385 1727204591.21604: variable 'current_interfaces' from source: set_fact 51385 1727204591.21620: Evaluated conditional (type == 'tap' and state == 'present' and interface not in current_interfaces): False 51385 1727204591.21627: when evaluation is False, skipping this task 51385 1727204591.21634: _execute() done 51385 1727204591.21642: dumping result to json 51385 1727204591.21650: done dumping result, returning 51385 1727204591.21662: done running TaskExecutor() for managed-node1/TASK: Create tap interface lsr101 [0affcd87-79f5-6b1f-5706-000000000224] 51385 1727204591.21676: sending task result for task 0affcd87-79f5-6b1f-5706-000000000224 skipping: [managed-node1] => { "changed": false, "false_condition": "type == 'tap' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 51385 1727204591.21831: no more pending results, returning what we have 51385 1727204591.21836: results queue empty 51385 
1727204591.21837: checking for any_errors_fatal 51385 1727204591.21844: done checking for any_errors_fatal 51385 1727204591.21845: checking for max_fail_percentage 51385 1727204591.21847: done checking for max_fail_percentage 51385 1727204591.21848: checking to see if all hosts have failed and the running result is not ok 51385 1727204591.21849: done checking to see if all hosts have failed 51385 1727204591.21850: getting the remaining hosts for this loop 51385 1727204591.21851: done getting the remaining hosts for this loop 51385 1727204591.21856: getting the next task for host managed-node1 51385 1727204591.21868: done getting next task for host managed-node1 51385 1727204591.21872: ^ task is: TASK: Delete tap interface {{ interface }} 51385 1727204591.21876: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51385 1727204591.21881: getting variables 51385 1727204591.21883: in VariableManager get_vars() 51385 1727204591.21931: Calling all_inventory to load vars for managed-node1 51385 1727204591.21934: Calling groups_inventory to load vars for managed-node1 51385 1727204591.21936: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204591.21951: Calling all_plugins_play to load vars for managed-node1 51385 1727204591.21953: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204591.21956: Calling groups_plugins_play to load vars for managed-node1 51385 1727204591.22186: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204591.22427: done with get_vars() 51385 1727204591.22441: done getting variables 51385 1727204591.22699: done sending task result for task 0affcd87-79f5-6b1f-5706-000000000224 51385 1727204591.22703: WORKER PROCESS EXITING 51385 1727204591.22744: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 51385 1727204591.22878: variable 'interface' from source: play vars TASK [Delete tap interface lsr101] ********************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:65 Tuesday 24 September 2024 15:03:11 -0400 (0:00:00.038) 0:00:09.633 ***** 51385 1727204591.22993: entering _queue_task() for managed-node1/command 51385 1727204591.23275: worker is 1 (out of 1 available) 51385 1727204591.23288: exiting _queue_task() for managed-node1/command 51385 1727204591.23301: done queuing things up, now waiting for results queue to drain 51385 1727204591.23302: waiting for pending results... 
51385 1727204591.23574: running TaskExecutor() for managed-node1/TASK: Delete tap interface lsr101 51385 1727204591.23687: in run() - task 0affcd87-79f5-6b1f-5706-000000000225 51385 1727204591.23710: variable 'ansible_search_path' from source: unknown 51385 1727204591.23717: variable 'ansible_search_path' from source: unknown 51385 1727204591.23759: calling self._execute() 51385 1727204591.23858: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204591.23874: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204591.23887: variable 'omit' from source: magic vars 51385 1727204591.25103: variable 'ansible_distribution_major_version' from source: facts 51385 1727204591.25186: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204591.25432: variable 'type' from source: play vars 51385 1727204591.25444: variable 'state' from source: include params 51385 1727204591.25453: variable 'interface' from source: play vars 51385 1727204591.25462: variable 'current_interfaces' from source: set_fact 51385 1727204591.25476: Evaluated conditional (type == 'tap' and state == 'absent' and interface in current_interfaces): False 51385 1727204591.25483: when evaluation is False, skipping this task 51385 1727204591.25489: _execute() done 51385 1727204591.25495: dumping result to json 51385 1727204591.25502: done dumping result, returning 51385 1727204591.25510: done running TaskExecutor() for managed-node1/TASK: Delete tap interface lsr101 [0affcd87-79f5-6b1f-5706-000000000225] 51385 1727204591.25525: sending task result for task 0affcd87-79f5-6b1f-5706-000000000225 skipping: [managed-node1] => { "changed": false, "false_condition": "type == 'tap' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 51385 1727204591.25685: no more pending results, returning what we have 51385 1727204591.25689: results queue empty 51385 1727204591.25690: 
checking for any_errors_fatal 51385 1727204591.25698: done checking for any_errors_fatal 51385 1727204591.25699: checking for max_fail_percentage 51385 1727204591.25700: done checking for max_fail_percentage 51385 1727204591.25701: checking to see if all hosts have failed and the running result is not ok 51385 1727204591.25702: done checking to see if all hosts have failed 51385 1727204591.25703: getting the remaining hosts for this loop 51385 1727204591.25705: done getting the remaining hosts for this loop 51385 1727204591.25709: getting the next task for host managed-node1 51385 1727204591.25718: done getting next task for host managed-node1 51385 1727204591.25721: ^ task is: TASK: Include the task 'assert_device_present.yml' 51385 1727204591.25723: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51385 1727204591.25727: getting variables 51385 1727204591.25729: in VariableManager get_vars() 51385 1727204591.25779: Calling all_inventory to load vars for managed-node1 51385 1727204591.25783: Calling groups_inventory to load vars for managed-node1 51385 1727204591.25785: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204591.25800: Calling all_plugins_play to load vars for managed-node1 51385 1727204591.25803: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204591.25806: Calling groups_plugins_play to load vars for managed-node1 51385 1727204591.26093: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204591.26318: done with get_vars() 51385 1727204591.26330: done getting variables TASK [Include the task 'assert_device_present.yml'] **************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_vlan_mtu.yml:16 Tuesday 24 September 2024 15:03:11 -0400 (0:00:00.035) 0:00:09.668 ***** 51385 1727204591.26455: entering _queue_task() for managed-node1/include_tasks 51385 1727204591.26722: done sending task result for task 0affcd87-79f5-6b1f-5706-000000000225 51385 1727204591.26726: WORKER PROCESS EXITING 51385 1727204591.26974: worker is 1 (out of 1 available) 51385 1727204591.26986: exiting _queue_task() for managed-node1/include_tasks 51385 1727204591.26998: done queuing things up, now waiting for results queue to drain 51385 1727204591.26999: waiting for pending results... 
51385 1727204591.27274: running TaskExecutor() for managed-node1/TASK: Include the task 'assert_device_present.yml' 51385 1727204591.27379: in run() - task 0affcd87-79f5-6b1f-5706-00000000000d 51385 1727204591.27399: variable 'ansible_search_path' from source: unknown 51385 1727204591.27442: calling self._execute() 51385 1727204591.27538: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204591.27553: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204591.27575: variable 'omit' from source: magic vars 51385 1727204591.28269: variable 'ansible_distribution_major_version' from source: facts 51385 1727204591.28615: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204591.28626: _execute() done 51385 1727204591.28632: dumping result to json 51385 1727204591.28715: done dumping result, returning 51385 1727204591.28726: done running TaskExecutor() for managed-node1/TASK: Include the task 'assert_device_present.yml' [0affcd87-79f5-6b1f-5706-00000000000d] 51385 1727204591.28737: sending task result for task 0affcd87-79f5-6b1f-5706-00000000000d 51385 1727204591.28871: no more pending results, returning what we have 51385 1727204591.28876: in VariableManager get_vars() 51385 1727204591.28924: Calling all_inventory to load vars for managed-node1 51385 1727204591.28927: Calling groups_inventory to load vars for managed-node1 51385 1727204591.28930: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204591.28945: Calling all_plugins_play to load vars for managed-node1 51385 1727204591.28948: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204591.28950: Calling groups_plugins_play to load vars for managed-node1 51385 1727204591.29167: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204591.29396: done with get_vars() 51385 1727204591.29404: variable 'ansible_search_path' 
from source: unknown 51385 1727204591.29419: we have included files to process 51385 1727204591.29420: generating all_blocks data 51385 1727204591.29422: done generating all_blocks data 51385 1727204591.29427: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 51385 1727204591.29428: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 51385 1727204591.29431: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 51385 1727204591.29825: in VariableManager get_vars() 51385 1727204591.29847: done with get_vars() 51385 1727204591.30133: done sending task result for task 0affcd87-79f5-6b1f-5706-00000000000d 51385 1727204591.30136: WORKER PROCESS EXITING 51385 1727204591.30249: done processing included file 51385 1727204591.30251: iterating over new_blocks loaded from include file 51385 1727204591.30253: in VariableManager get_vars() 51385 1727204591.30278: done with get_vars() 51385 1727204591.30280: filtering new block on tags 51385 1727204591.30298: done filtering new block on tags 51385 1727204591.30300: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed-node1 51385 1727204591.30305: extending task lists for all hosts with included blocks 51385 1727204591.33878: done extending task lists 51385 1727204591.33880: done processing included files 51385 1727204591.33881: results queue empty 51385 1727204591.33882: checking for any_errors_fatal 51385 1727204591.33885: done checking for any_errors_fatal 51385 1727204591.33886: checking for max_fail_percentage 51385 1727204591.33887: done checking for max_fail_percentage 51385 1727204591.33888: checking to 
see if all hosts have failed and the running result is not ok 51385 1727204591.33889: done checking to see if all hosts have failed 51385 1727204591.33890: getting the remaining hosts for this loop 51385 1727204591.33891: done getting the remaining hosts for this loop 51385 1727204591.33893: getting the next task for host managed-node1 51385 1727204591.33897: done getting next task for host managed-node1 51385 1727204591.33900: ^ task is: TASK: Include the task 'get_interface_stat.yml' 51385 1727204591.33902: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51385 1727204591.33905: getting variables 51385 1727204591.33906: in VariableManager get_vars() 51385 1727204591.33924: Calling all_inventory to load vars for managed-node1 51385 1727204591.33926: Calling groups_inventory to load vars for managed-node1 51385 1727204591.33928: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204591.33934: Calling all_plugins_play to load vars for managed-node1 51385 1727204591.33937: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204591.33940: Calling groups_plugins_play to load vars for managed-node1 51385 1727204591.34128: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204591.34344: done with get_vars() 51385 1727204591.34355: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Tuesday 24 September 2024 15:03:11 -0400 (0:00:00.079) 0:00:09.748 ***** 51385 1727204591.34436: entering _queue_task() for managed-node1/include_tasks 51385 1727204591.34766: worker is 1 (out of 1 available) 51385 1727204591.34780: exiting _queue_task() for managed-node1/include_tasks 51385 1727204591.34794: done queuing things up, now waiting for results queue to drain 51385 1727204591.34795: waiting for pending results... 
51385 1727204591.35076: running TaskExecutor() for managed-node1/TASK: Include the task 'get_interface_stat.yml' 51385 1727204591.35193: in run() - task 0affcd87-79f5-6b1f-5706-00000000038b 51385 1727204591.35214: variable 'ansible_search_path' from source: unknown 51385 1727204591.35221: variable 'ansible_search_path' from source: unknown 51385 1727204591.35270: calling self._execute() 51385 1727204591.35363: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204591.35377: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204591.35392: variable 'omit' from source: magic vars 51385 1727204591.35767: variable 'ansible_distribution_major_version' from source: facts 51385 1727204591.35794: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204591.35806: _execute() done 51385 1727204591.35815: dumping result to json 51385 1727204591.35823: done dumping result, returning 51385 1727204591.35832: done running TaskExecutor() for managed-node1/TASK: Include the task 'get_interface_stat.yml' [0affcd87-79f5-6b1f-5706-00000000038b] 51385 1727204591.35843: sending task result for task 0affcd87-79f5-6b1f-5706-00000000038b 51385 1727204591.35967: no more pending results, returning what we have 51385 1727204591.35974: in VariableManager get_vars() 51385 1727204591.36021: Calling all_inventory to load vars for managed-node1 51385 1727204591.36024: Calling groups_inventory to load vars for managed-node1 51385 1727204591.36027: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204591.36041: Calling all_plugins_play to load vars for managed-node1 51385 1727204591.36044: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204591.36047: Calling groups_plugins_play to load vars for managed-node1 51385 1727204591.36293: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204591.36887: done 
with get_vars() 51385 1727204591.36895: variable 'ansible_search_path' from source: unknown 51385 1727204591.36896: variable 'ansible_search_path' from source: unknown 51385 1727204591.36935: we have included files to process 51385 1727204591.36936: generating all_blocks data 51385 1727204591.36938: done generating all_blocks data 51385 1727204591.36939: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 51385 1727204591.36940: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 51385 1727204591.36942: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 51385 1727204591.37920: done sending task result for task 0affcd87-79f5-6b1f-5706-00000000038b 51385 1727204591.37924: WORKER PROCESS EXITING 51385 1727204591.38052: done processing included file 51385 1727204591.38054: iterating over new_blocks loaded from include file 51385 1727204591.38056: in VariableManager get_vars() 51385 1727204591.38082: done with get_vars() 51385 1727204591.38084: filtering new block on tags 51385 1727204591.38101: done filtering new block on tags 51385 1727204591.38103: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node1 51385 1727204591.38109: extending task lists for all hosts with included blocks 51385 1727204591.38223: done extending task lists 51385 1727204591.38224: done processing included files 51385 1727204591.38225: results queue empty 51385 1727204591.38226: checking for any_errors_fatal 51385 1727204591.38230: done checking for any_errors_fatal 51385 1727204591.38231: checking for max_fail_percentage 51385 1727204591.38232: done checking for max_fail_percentage 
51385 1727204591.38233: checking to see if all hosts have failed and the running result is not ok 51385 1727204591.38234: done checking to see if all hosts have failed 51385 1727204591.38234: getting the remaining hosts for this loop 51385 1727204591.38236: done getting the remaining hosts for this loop 51385 1727204591.38238: getting the next task for host managed-node1 51385 1727204591.38242: done getting next task for host managed-node1 51385 1727204591.38244: ^ task is: TASK: Get stat for interface {{ interface }} 51385 1727204591.38247: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51385 1727204591.38249: getting variables 51385 1727204591.38251: in VariableManager get_vars() 51385 1727204591.38268: Calling all_inventory to load vars for managed-node1 51385 1727204591.38271: Calling groups_inventory to load vars for managed-node1 51385 1727204591.38273: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204591.38280: Calling all_plugins_play to load vars for managed-node1 51385 1727204591.38283: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204591.38286: Calling groups_plugins_play to load vars for managed-node1 51385 1727204591.38447: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204591.38677: done with get_vars() 51385 1727204591.38686: done getting variables 51385 1727204591.38866: variable 'interface' from source: play vars TASK [Get stat for interface lsr101] ******************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 15:03:11 -0400 (0:00:00.044) 0:00:09.792 ***** 51385 1727204591.38895: entering _queue_task() for managed-node1/stat 51385 1727204591.39176: worker is 1 (out of 1 available) 51385 1727204591.39188: exiting _queue_task() for managed-node1/stat 51385 1727204591.39200: done queuing things up, now waiting for results queue to drain 51385 1727204591.39201: waiting for pending results... 
51385 1727204591.39470: running TaskExecutor() for managed-node1/TASK: Get stat for interface lsr101 51385 1727204591.39605: in run() - task 0affcd87-79f5-6b1f-5706-0000000004a4 51385 1727204591.39622: variable 'ansible_search_path' from source: unknown 51385 1727204591.39629: variable 'ansible_search_path' from source: unknown 51385 1727204591.39678: calling self._execute() 51385 1727204591.39773: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204591.39786: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204591.39801: variable 'omit' from source: magic vars 51385 1727204591.40257: variable 'ansible_distribution_major_version' from source: facts 51385 1727204591.40279: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204591.40289: variable 'omit' from source: magic vars 51385 1727204591.40340: variable 'omit' from source: magic vars 51385 1727204591.40449: variable 'interface' from source: play vars 51385 1727204591.40480: variable 'omit' from source: magic vars 51385 1727204591.40529: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 51385 1727204591.40576: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 51385 1727204591.40604: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 51385 1727204591.40629: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 51385 1727204591.40644: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 51385 1727204591.40681: variable 'inventory_hostname' from source: host vars for 'managed-node1' 51385 1727204591.40695: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204591.40704: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204591.40818: Set connection var ansible_pipelining to False 51385 1727204591.40827: Set connection var ansible_shell_type to sh 51385 1727204591.40848: Set connection var ansible_module_compression to ZIP_DEFLATED 51385 1727204591.40866: Set connection var ansible_timeout to 10 51385 1727204591.40875: Set connection var ansible_connection to ssh 51385 1727204591.40884: Set connection var ansible_shell_executable to /bin/sh 51385 1727204591.41558: variable 'ansible_shell_executable' from source: unknown 51385 1727204591.41572: variable 'ansible_connection' from source: unknown 51385 1727204591.41580: variable 'ansible_module_compression' from source: unknown 51385 1727204591.41586: variable 'ansible_shell_type' from source: unknown 51385 1727204591.41670: variable 'ansible_shell_executable' from source: unknown 51385 1727204591.41678: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204591.41686: variable 'ansible_pipelining' from source: unknown 51385 1727204591.41692: variable 'ansible_timeout' from source: unknown 51385 1727204591.41700: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204591.42125: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 51385 1727204591.42140: variable 'omit' from source: magic vars 51385 1727204591.42150: starting attempt loop 51385 1727204591.42156: running the handler 51385 1727204591.42179: _low_level_execute_command(): starting 51385 1727204591.42190: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 51385 1727204591.44117: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204591.44122: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204591.44148: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204591.44151: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204591.44154: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204591.44340: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204591.44344: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204591.44351: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204591.44505: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204591.45999: stdout chunk (state=3): >>>/root <<< 51385 1727204591.46180: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204591.46183: stdout chunk (state=3): >>><<< 51385 1727204591.46193: stderr chunk (state=3): >>><<< 51385 1727204591.46222: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204591.46234: _low_level_execute_command(): starting 51385 1727204591.46243: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204591.462206-52287-12146569533495 `" && echo ansible-tmp-1727204591.462206-52287-12146569533495="` echo /root/.ansible/tmp/ansible-tmp-1727204591.462206-52287-12146569533495 `" ) && sleep 0' 51385 1727204591.47697: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204591.47851: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204591.47855: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204591.47874: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204591.47917: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204591.47928: stderr chunk 
(state=3): >>>debug2: match not found <<< 51385 1727204591.47931: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204591.47933: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 51385 1727204591.47947: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 51385 1727204591.47956: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 51385 1727204591.47979: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204591.47982: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204591.47990: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204591.48001: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204591.48025: stderr chunk (state=3): >>>debug2: match found <<< 51385 1727204591.48028: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204591.48106: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204591.48120: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204591.48133: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204591.48217: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204591.50116: stdout chunk (state=3): >>>ansible-tmp-1727204591.462206-52287-12146569533495=/root/.ansible/tmp/ansible-tmp-1727204591.462206-52287-12146569533495 <<< 51385 1727204591.50269: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204591.50273: stderr chunk (state=3): >>><<< 51385 1727204591.50275: stdout chunk (state=3): >>><<< 51385 1727204591.50298: 
_low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204591.462206-52287-12146569533495=/root/.ansible/tmp/ansible-tmp-1727204591.462206-52287-12146569533495 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204591.50348: variable 'ansible_module_compression' from source: unknown 51385 1727204591.50412: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-51385tpxlmlox/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 51385 1727204591.50448: variable 'ansible_facts' from source: unknown 51385 1727204591.50532: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204591.462206-52287-12146569533495/AnsiballZ_stat.py 51385 1727204591.51076: Sending initial data 51385 1727204591.51079: Sent initial data (151 bytes) 51385 1727204591.52144: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204591.52153: stderr chunk (state=3): 
>>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204591.52166: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204591.52180: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204591.52229: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204591.52236: stderr chunk (state=3): >>>debug2: match not found <<< 51385 1727204591.52246: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204591.52263: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 51385 1727204591.52269: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 51385 1727204591.52276: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 51385 1727204591.52284: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204591.52293: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204591.52311: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204591.52327: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204591.52334: stderr chunk (state=3): >>>debug2: match found <<< 51385 1727204591.52342: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204591.52416: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204591.52443: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204591.52456: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204591.52542: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 51385 1727204591.54252: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 51385 1727204591.54299: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 51385 1727204591.54353: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-51385tpxlmlox/tmpfm8liuzb /root/.ansible/tmp/ansible-tmp-1727204591.462206-52287-12146569533495/AnsiballZ_stat.py <<< 51385 1727204591.54403: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 51385 1727204591.55599: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204591.55689: stderr chunk (state=3): >>><<< 51385 1727204591.55692: stdout chunk (state=3): >>><<< 51385 1727204591.55715: done transferring module to remote 51385 1727204591.55726: _low_level_execute_command(): starting 51385 1727204591.55731: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204591.462206-52287-12146569533495/ /root/.ansible/tmp/ansible-tmp-1727204591.462206-52287-12146569533495/AnsiballZ_stat.py && sleep 0' 51385 1727204591.56414: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204591.56422: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 
1727204591.56432: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204591.56446: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204591.56503: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204591.56510: stderr chunk (state=3): >>>debug2: match not found <<< 51385 1727204591.56520: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204591.56533: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 51385 1727204591.56541: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 51385 1727204591.56548: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 51385 1727204591.56555: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204591.56577: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204591.56593: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204591.56601: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204591.56608: stderr chunk (state=3): >>>debug2: match found <<< 51385 1727204591.56617: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204591.56716: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204591.56731: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204591.56753: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204591.56950: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 
1727204591.58551: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204591.58678: stderr chunk (state=3): >>><<< 51385 1727204591.58681: stdout chunk (state=3): >>><<< 51385 1727204591.58684: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204591.58686: _low_level_execute_command(): starting 51385 1727204591.58689: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204591.462206-52287-12146569533495/AnsiballZ_stat.py && sleep 0' 51385 1727204591.59300: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204591.59309: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204591.59319: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 
1727204591.59366: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204591.59562: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204591.59574: stderr chunk (state=3): >>>debug2: match not found <<< 51385 1727204591.59585: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204591.59598: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 51385 1727204591.59606: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 51385 1727204591.59615: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 51385 1727204591.59620: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204591.59629: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204591.59641: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204591.59667: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204591.59670: stderr chunk (state=3): >>>debug2: match found <<< 51385 1727204591.59673: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204591.59737: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204591.59754: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204591.59767: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204591.59863: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204591.72810: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/lsr101", 
"mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 30850, "dev": 21, "nlink": 1, "atime": 1727204589.648942, "mtime": 1727204589.648942, "ctime": 1727204589.648942, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/lsr101", "lnk_target": "../../devices/virtual/net/lsr101", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/lsr101", "follow": false, "checksum_algorithm": "sha1"}}} <<< 51385 1727204591.73781: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
<<< 51385 1727204591.73785: stdout chunk (state=3): >>><<< 51385 1727204591.73793: stderr chunk (state=3): >>><<< 51385 1727204591.73811: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/lsr101", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 30850, "dev": 21, "nlink": 1, "atime": 1727204589.648942, "mtime": 1727204589.648942, "ctime": 1727204589.648942, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/lsr101", "lnk_target": "../../devices/virtual/net/lsr101", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/lsr101", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 51385 1727204591.73870: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/lsr101', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204591.462206-52287-12146569533495/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 51385 1727204591.73884: _low_level_execute_command(): starting 51385 1727204591.73890: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204591.462206-52287-12146569533495/ > /dev/null 2>&1 && sleep 0' 51385 1727204591.74585: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204591.74593: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204591.74604: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204591.74622: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204591.74667: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204591.74675: stderr 
chunk (state=3): >>>debug2: match not found <<< 51385 1727204591.74685: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204591.74699: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 51385 1727204591.74706: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 51385 1727204591.74713: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 51385 1727204591.74722: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204591.74735: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204591.74753: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204591.74763: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204591.74771: stderr chunk (state=3): >>>debug2: match found <<< 51385 1727204591.74780: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204591.74865: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204591.74880: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204591.74892: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204591.74986: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204591.76718: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204591.76768: stderr chunk (state=3): >>><<< 51385 1727204591.76771: stdout chunk (state=3): >>><<< 51385 1727204591.76787: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204591.76795: handler run complete 51385 1727204591.76827: attempt loop complete, returning result 51385 1727204591.76830: _execute() done 51385 1727204591.76833: dumping result to json 51385 1727204591.76838: done dumping result, returning 51385 1727204591.76845: done running TaskExecutor() for managed-node1/TASK: Get stat for interface lsr101 [0affcd87-79f5-6b1f-5706-0000000004a4] 51385 1727204591.76850: sending task result for task 0affcd87-79f5-6b1f-5706-0000000004a4 51385 1727204591.76957: done sending task result for task 0affcd87-79f5-6b1f-5706-0000000004a4 51385 1727204591.76960: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "stat": { "atime": 1727204589.648942, "block_size": 4096, "blocks": 0, "ctime": 1727204589.648942, "dev": 21, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 30850, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": 
false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/lsr101", "lnk_target": "../../devices/virtual/net/lsr101", "mode": "0777", "mtime": 1727204589.648942, "nlink": 1, "path": "/sys/class/net/lsr101", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 51385 1727204591.77123: no more pending results, returning what we have 51385 1727204591.77126: results queue empty 51385 1727204591.77127: checking for any_errors_fatal 51385 1727204591.77128: done checking for any_errors_fatal 51385 1727204591.77129: checking for max_fail_percentage 51385 1727204591.77131: done checking for max_fail_percentage 51385 1727204591.77131: checking to see if all hosts have failed and the running result is not ok 51385 1727204591.77132: done checking to see if all hosts have failed 51385 1727204591.77133: getting the remaining hosts for this loop 51385 1727204591.77134: done getting the remaining hosts for this loop 51385 1727204591.77137: getting the next task for host managed-node1 51385 1727204591.77143: done getting next task for host managed-node1 51385 1727204591.77145: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 51385 1727204591.77148: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51385 1727204591.77151: getting variables 51385 1727204591.77152: in VariableManager get_vars() 51385 1727204591.77182: Calling all_inventory to load vars for managed-node1 51385 1727204591.77188: Calling groups_inventory to load vars for managed-node1 51385 1727204591.77189: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204591.77197: Calling all_plugins_play to load vars for managed-node1 51385 1727204591.77199: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204591.77200: Calling groups_plugins_play to load vars for managed-node1 51385 1727204591.77308: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204591.77433: done with get_vars() 51385 1727204591.77440: done getting variables 51385 1727204591.77515: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 51385 1727204591.77601: variable 'interface' from source: play vars TASK [Assert that the interface is present - 'lsr101'] ************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Tuesday 24 September 2024 15:03:11 -0400 (0:00:00.387) 0:00:10.180 ***** 51385 1727204591.77625: entering _queue_task() for managed-node1/assert 51385 1727204591.77626: Creating lock for assert 51385 1727204591.77827: worker is 1 (out of 1 available) 51385 1727204591.77842: exiting _queue_task() for managed-node1/assert 51385 1727204591.77852: done queuing things up, now waiting for results queue to drain 51385 1727204591.77854: waiting for pending results... 
51385 1727204591.78019: running TaskExecutor() for managed-node1/TASK: Assert that the interface is present - 'lsr101' 51385 1727204591.78092: in run() - task 0affcd87-79f5-6b1f-5706-00000000038c 51385 1727204591.78102: variable 'ansible_search_path' from source: unknown 51385 1727204591.78106: variable 'ansible_search_path' from source: unknown 51385 1727204591.78133: calling self._execute() 51385 1727204591.78199: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204591.78203: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204591.78212: variable 'omit' from source: magic vars 51385 1727204591.78511: variable 'ansible_distribution_major_version' from source: facts 51385 1727204591.78542: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204591.78557: variable 'omit' from source: magic vars 51385 1727204591.78584: variable 'omit' from source: magic vars 51385 1727204591.78661: variable 'interface' from source: play vars 51385 1727204591.78675: variable 'omit' from source: magic vars 51385 1727204591.78933: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 51385 1727204591.78937: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 51385 1727204591.78940: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 51385 1727204591.78942: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 51385 1727204591.78944: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 51385 1727204591.78947: variable 'inventory_hostname' from source: host vars for 'managed-node1' 51385 1727204591.78949: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204591.78951: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
51385 1727204591.78952: Set connection var ansible_pipelining to False
51385 1727204591.78954: Set connection var ansible_shell_type to sh
51385 1727204591.78967: Set connection var ansible_module_compression to ZIP_DEFLATED
51385 1727204591.78970: Set connection var ansible_timeout to 10
51385 1727204591.78972: Set connection var ansible_connection to ssh
51385 1727204591.78977: Set connection var ansible_shell_executable to /bin/sh
51385 1727204591.78999: variable 'ansible_shell_executable' from source: unknown
51385 1727204591.79003: variable 'ansible_connection' from source: unknown
51385 1727204591.79005: variable 'ansible_module_compression' from source: unknown
51385 1727204591.79007: variable 'ansible_shell_type' from source: unknown
51385 1727204591.79009: variable 'ansible_shell_executable' from source: unknown
51385 1727204591.79012: variable 'ansible_host' from source: host vars for 'managed-node1'
51385 1727204591.79016: variable 'ansible_pipelining' from source: unknown
51385 1727204591.79018: variable 'ansible_timeout' from source: unknown
51385 1727204591.79023: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
51385 1727204591.79167: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
51385 1727204591.79179: variable 'omit' from source: magic vars
51385 1727204591.79182: starting attempt loop
51385 1727204591.79185: running the handler
51385 1727204591.79316: variable 'interface_stat' from source: set_fact
51385 1727204591.79335: Evaluated conditional (interface_stat.stat.exists): True
51385 1727204591.79341: handler run complete
51385 1727204591.79363: attempt loop complete, returning result
51385 1727204591.79366: _execute() done
51385 1727204591.79371: dumping result to json
51385 1727204591.79373: done dumping result, returning
51385 1727204591.79376: done running TaskExecutor() for managed-node1/TASK: Assert that the interface is present - 'lsr101' [0affcd87-79f5-6b1f-5706-00000000038c]
51385 1727204591.79383: sending task result for task 0affcd87-79f5-6b1f-5706-00000000038c
51385 1727204591.79468: done sending task result for task 0affcd87-79f5-6b1f-5706-00000000038c
51385 1727204591.79471: WORKER PROCESS EXITING
ok: [managed-node1] => {
    "changed": false
}

MSG:

All assertions passed
51385 1727204591.79515: no more pending results, returning what we have
51385 1727204591.79519: results queue empty
51385 1727204591.79520: checking for any_errors_fatal
51385 1727204591.79526: done checking for any_errors_fatal
51385 1727204591.79526: checking for max_fail_percentage
51385 1727204591.79528: done checking for max_fail_percentage
51385 1727204591.79529: checking to see if all hosts have failed and the running result is not ok
51385 1727204591.79530: done checking to see if all hosts have failed
51385 1727204591.79531: getting the remaining hosts for this loop
51385 1727204591.79533: done getting the remaining hosts for this loop
51385 1727204591.79536: getting the next task for host managed-node1
51385 1727204591.79542: done getting next task for host managed-node1
51385 1727204591.79545: ^ task is: TASK: TEST: I can configure the MTU for a vlan interface without autoconnect.
51385 1727204591.79546: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
51385 1727204591.79549: getting variables
51385 1727204591.79551: in VariableManager get_vars()
51385 1727204591.79590: Calling all_inventory to load vars for managed-node1
51385 1727204591.79593: Calling groups_inventory to load vars for managed-node1
51385 1727204591.79595: Calling all_plugins_inventory to load vars for managed-node1
51385 1727204591.79604: Calling all_plugins_play to load vars for managed-node1
51385 1727204591.79606: Calling groups_plugins_inventory to load vars for managed-node1
51385 1727204591.79608: Calling groups_plugins_play to load vars for managed-node1
51385 1727204591.79802: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
51385 1727204591.80104: done with get_vars()
51385 1727204591.80121: done getting variables
51385 1727204591.80171: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [TEST: I can configure the MTU for a vlan interface without autoconnect.] ***
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_vlan_mtu.yml:18
Tuesday 24 September 2024 15:03:11 -0400 (0:00:00.025) 0:00:10.205 *****
51385 1727204591.80191: entering _queue_task() for managed-node1/debug
51385 1727204591.80378: worker is 1 (out of 1 available)
51385 1727204591.80391: exiting _queue_task() for managed-node1/debug
51385 1727204591.80401: done queuing things up, now waiting for results queue to drain
51385 1727204591.80402: waiting for pending results...
51385 1727204591.80580: running TaskExecutor() for managed-node1/TASK: TEST: I can configure the MTU for a vlan interface without autoconnect.
51385 1727204591.80635: in run() - task 0affcd87-79f5-6b1f-5706-00000000000e
51385 1727204591.80650: variable 'ansible_search_path' from source: unknown
51385 1727204591.80682: calling self._execute()
51385 1727204591.80746: variable 'ansible_host' from source: host vars for 'managed-node1'
51385 1727204591.80753: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
51385 1727204591.80765: variable 'omit' from source: magic vars
51385 1727204591.81020: variable 'ansible_distribution_major_version' from source: facts
51385 1727204591.81030: Evaluated conditional (ansible_distribution_major_version != '6'): True
51385 1727204591.81036: variable 'omit' from source: magic vars
51385 1727204591.81049: variable 'omit' from source: magic vars
51385 1727204591.81079: variable 'omit' from source: magic vars
51385 1727204591.81112: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
51385 1727204591.81138: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
51385 1727204591.81155: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
51385 1727204591.81169: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
51385 1727204591.81181: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
51385 1727204591.81203: variable 'inventory_hostname' from source: host vars for 'managed-node1'
51385 1727204591.81214: variable 'ansible_host' from source: host vars for 'managed-node1'
51385 1727204591.81217: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
51385 1727204591.81289: Set connection var ansible_pipelining to False
51385 1727204591.81293: Set connection var ansible_shell_type to sh
51385 1727204591.81300: Set connection var ansible_module_compression to ZIP_DEFLATED
51385 1727204591.81307: Set connection var ansible_timeout to 10
51385 1727204591.81311: Set connection var ansible_connection to ssh
51385 1727204591.81314: Set connection var ansible_shell_executable to /bin/sh
51385 1727204591.81336: variable 'ansible_shell_executable' from source: unknown
51385 1727204591.81340: variable 'ansible_connection' from source: unknown
51385 1727204591.81343: variable 'ansible_module_compression' from source: unknown
51385 1727204591.81345: variable 'ansible_shell_type' from source: unknown
51385 1727204591.81347: variable 'ansible_shell_executable' from source: unknown
51385 1727204591.81349: variable 'ansible_host' from source: host vars for 'managed-node1'
51385 1727204591.81351: variable 'ansible_pipelining' from source: unknown
51385 1727204591.81354: variable 'ansible_timeout' from source: unknown
51385 1727204591.81358: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
51385 1727204591.81461: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
51385 1727204591.81470: variable 'omit' from source: magic vars
51385 1727204591.81475: starting attempt loop
51385 1727204591.81478: running the handler
51385 1727204591.81514: handler run complete
51385 1727204591.81525: attempt loop complete, returning result
51385 1727204591.81530: _execute() done
51385 1727204591.81532: dumping result to json
51385 1727204591.81535: done dumping result, returning
51385 1727204591.81540: done running TaskExecutor() for managed-node1/TASK: TEST: I can configure the MTU for a vlan interface without autoconnect. [0affcd87-79f5-6b1f-5706-00000000000e]
51385 1727204591.81552: sending task result for task 0affcd87-79f5-6b1f-5706-00000000000e
51385 1727204591.81632: done sending task result for task 0affcd87-79f5-6b1f-5706-00000000000e
51385 1727204591.81635: WORKER PROCESS EXITING
ok: [managed-node1] => {}

MSG:

##################################################
51385 1727204591.81685: no more pending results, returning what we have
51385 1727204591.81689: results queue empty
51385 1727204591.81690: checking for any_errors_fatal
51385 1727204591.81697: done checking for any_errors_fatal
51385 1727204591.81697: checking for max_fail_percentage
51385 1727204591.81699: done checking for max_fail_percentage
51385 1727204591.81700: checking to see if all hosts have failed and the running result is not ok
51385 1727204591.81701: done checking to see if all hosts have failed
51385 1727204591.81701: getting the remaining hosts for this loop
51385 1727204591.81703: done getting the remaining hosts for this loop
51385 1727204591.81706: getting the next task for host managed-node1
51385 1727204591.81712: done getting next task for host managed-node1
51385 1727204591.81718: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role
51385 1727204591.81721: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
51385 1727204591.81734: getting variables
51385 1727204591.81736: in VariableManager get_vars()
51385 1727204591.81774: Calling all_inventory to load vars for managed-node1
51385 1727204591.81777: Calling groups_inventory to load vars for managed-node1
51385 1727204591.81779: Calling all_plugins_inventory to load vars for managed-node1
51385 1727204591.81787: Calling all_plugins_play to load vars for managed-node1
51385 1727204591.81789: Calling groups_plugins_inventory to load vars for managed-node1
51385 1727204591.81792: Calling groups_plugins_play to load vars for managed-node1
51385 1727204591.81911: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
51385 1727204591.82043: done with get_vars()
51385 1727204591.82051: done getting variables

TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] ***
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4
Tuesday 24 September 2024 15:03:11 -0400 (0:00:00.019) 0:00:10.225 *****
51385 1727204591.82122: entering _queue_task() for managed-node1/include_tasks
51385 1727204591.82306: worker is 1 (out of 1 available)
51385 1727204591.82320: exiting _queue_task() for managed-node1/include_tasks
51385 1727204591.82332: done queuing things up, now waiting for results queue to drain
51385 1727204591.82334: waiting for pending results...
51385 1727204591.82500: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role
51385 1727204591.82581: in run() - task 0affcd87-79f5-6b1f-5706-000000000016
51385 1727204591.82591: variable 'ansible_search_path' from source: unknown
51385 1727204591.82595: variable 'ansible_search_path' from source: unknown
51385 1727204591.82623: calling self._execute()
51385 1727204591.82687: variable 'ansible_host' from source: host vars for 'managed-node1'
51385 1727204591.82691: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
51385 1727204591.82700: variable 'omit' from source: magic vars
51385 1727204591.82953: variable 'ansible_distribution_major_version' from source: facts
51385 1727204591.82966: Evaluated conditional (ansible_distribution_major_version != '6'): True
51385 1727204591.82980: _execute() done
51385 1727204591.82984: dumping result to json
51385 1727204591.82986: done dumping result, returning
51385 1727204591.82989: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcd87-79f5-6b1f-5706-000000000016]
51385 1727204591.82997: sending task result for task 0affcd87-79f5-6b1f-5706-000000000016
51385 1727204591.83077: done sending task result for task 0affcd87-79f5-6b1f-5706-000000000016
51385 1727204591.83080: WORKER PROCESS EXITING
51385 1727204591.83135: no more pending results, returning what we have
51385 1727204591.83139: in VariableManager get_vars()
51385 1727204591.83178: Calling all_inventory to load vars for managed-node1
51385 1727204591.83181: Calling groups_inventory to load vars for managed-node1
51385 1727204591.83184: Calling all_plugins_inventory to load vars for managed-node1
51385 1727204591.83192: Calling all_plugins_play to load vars for managed-node1
51385 1727204591.83195: Calling groups_plugins_inventory to load vars for managed-node1
51385 1727204591.83198: Calling groups_plugins_play to load vars for managed-node1
51385 1727204591.83349: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
51385 1727204591.83478: done with get_vars()
51385 1727204591.83484: variable 'ansible_search_path' from source: unknown
51385 1727204591.83484: variable 'ansible_search_path' from source: unknown
51385 1727204591.83513: we have included files to process
51385 1727204591.83514: generating all_blocks data
51385 1727204591.83515: done generating all_blocks data
51385 1727204591.83518: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml
51385 1727204591.83519: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml
51385 1727204591.83520: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml
51385 1727204591.83980: done processing included file
51385 1727204591.83981: iterating over new_blocks loaded from include file
51385 1727204591.83982: in VariableManager get_vars()
51385 1727204591.83997: done with get_vars()
51385 1727204591.83999: filtering new block on tags
51385 1727204591.84010: done filtering new block on tags
51385 1727204591.84011: in VariableManager get_vars()
51385 1727204591.84024: done with get_vars()
51385 1727204591.84025: filtering new block on tags
51385 1727204591.84038: done filtering new block on tags
51385 1727204591.84039: in VariableManager get_vars()
51385 1727204591.84055: done with get_vars()
51385 1727204591.84056: filtering new block on tags
51385 1727204591.84071: done filtering new block on tags
51385 1727204591.84072: done iterating over new_blocks loaded from include file
included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node1
51385 1727204591.84076: extending task lists for all hosts with included blocks
51385 1727204591.84568: done extending task lists
51385 1727204591.84569: done processing included files
51385 1727204591.84570: results queue empty
51385 1727204591.84570: checking for any_errors_fatal
51385 1727204591.84573: done checking for any_errors_fatal
51385 1727204591.84573: checking for max_fail_percentage
51385 1727204591.84574: done checking for max_fail_percentage
51385 1727204591.84574: checking to see if all hosts have failed and the running result is not ok
51385 1727204591.84575: done checking to see if all hosts have failed
51385 1727204591.84575: getting the remaining hosts for this loop
51385 1727204591.84576: done getting the remaining hosts for this loop
51385 1727204591.84578: getting the next task for host managed-node1
51385 1727204591.84581: done getting next task for host managed-node1
51385 1727204591.84582: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present
51385 1727204591.84585: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
51385 1727204591.84593: getting variables
51385 1727204591.84594: in VariableManager get_vars()
51385 1727204591.84607: Calling all_inventory to load vars for managed-node1
51385 1727204591.84608: Calling groups_inventory to load vars for managed-node1
51385 1727204591.84609: Calling all_plugins_inventory to load vars for managed-node1
51385 1727204591.84613: Calling all_plugins_play to load vars for managed-node1
51385 1727204591.84615: Calling groups_plugins_inventory to load vars for managed-node1
51385 1727204591.84616: Calling groups_plugins_play to load vars for managed-node1
51385 1727204591.84726: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
51385 1727204591.84851: done with get_vars()
51385 1727204591.84858: done getting variables

TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] ***
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3
Tuesday 24 September 2024 15:03:11 -0400 (0:00:00.027) 0:00:10.253 *****
51385 1727204591.84909: entering _queue_task() for managed-node1/setup
51385 1727204591.85122: worker is 1 (out of 1 available)
51385 1727204591.85135: exiting _queue_task() for managed-node1/setup
51385 1727204591.85146: done queuing things up, now waiting for results queue to drain
51385 1727204591.85148: waiting for pending results...
51385 1727204591.85323: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present
51385 1727204591.85413: in run() - task 0affcd87-79f5-6b1f-5706-0000000004bf
51385 1727204591.85423: variable 'ansible_search_path' from source: unknown
51385 1727204591.85427: variable 'ansible_search_path' from source: unknown
51385 1727204591.85455: calling self._execute()
51385 1727204591.85520: variable 'ansible_host' from source: host vars for 'managed-node1'
51385 1727204591.85523: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
51385 1727204591.85531: variable 'omit' from source: magic vars
51385 1727204591.85809: variable 'ansible_distribution_major_version' from source: facts
51385 1727204591.85820: Evaluated conditional (ansible_distribution_major_version != '6'): True
51385 1727204591.85969: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
51385 1727204591.88449: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
51385 1727204591.88502: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
51385 1727204591.88540: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
51385 1727204591.88568: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
51385 1727204591.88589: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
51385 1727204591.88651: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
51385 1727204591.88675: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
51385 1727204591.88693: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
51385 1727204591.88719: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
51385 1727204591.88731: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
51385 1727204591.88775: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
51385 1727204591.88791: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
51385 1727204591.88807: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
51385 1727204591.88834: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
51385 1727204591.88846: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
51385 1727204591.88952: variable '__network_required_facts' from source: role '' defaults
51385 1727204591.88962: variable 'ansible_facts' from source: unknown
51385 1727204591.89021: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False
51385 1727204591.89025: when evaluation is False, skipping this task
51385 1727204591.89028: _execute() done
51385 1727204591.89030: dumping result to json
51385 1727204591.89032: done dumping result, returning
51385 1727204591.89038: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcd87-79f5-6b1f-5706-0000000004bf]
51385 1727204591.89044: sending task result for task 0affcd87-79f5-6b1f-5706-0000000004bf
51385 1727204591.89133: done sending task result for task 0affcd87-79f5-6b1f-5706-0000000004bf
51385 1727204591.89136: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
51385 1727204591.89215: no more pending results, returning what we have
51385 1727204591.89218: results queue empty
51385 1727204591.89219: checking for any_errors_fatal
51385 1727204591.89220: done checking for any_errors_fatal
51385 1727204591.89221: checking for max_fail_percentage
51385 1727204591.89223: done checking for max_fail_percentage
51385 1727204591.89223: checking to see if all hosts have failed and the running result is not ok
51385 1727204591.89224: done checking to see if all hosts have failed
51385 1727204591.89225: getting the remaining hosts for this loop
51385 1727204591.89227: done getting the remaining hosts for this loop
51385 1727204591.89230: getting the next task for host managed-node1
51385 1727204591.89239: done getting next task for host managed-node1
51385 1727204591.89243: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree
51385 1727204591.89249: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
51385 1727204591.89262: getting variables
51385 1727204591.89270: in VariableManager get_vars()
51385 1727204591.89309: Calling all_inventory to load vars for managed-node1
51385 1727204591.89312: Calling groups_inventory to load vars for managed-node1
51385 1727204591.89314: Calling all_plugins_inventory to load vars for managed-node1
51385 1727204591.89322: Calling all_plugins_play to load vars for managed-node1
51385 1727204591.89325: Calling groups_plugins_inventory to load vars for managed-node1
51385 1727204591.89327: Calling groups_plugins_play to load vars for managed-node1
51385 1727204591.89461: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
51385 1727204591.89622: done with get_vars()
51385 1727204591.89631: done getting variables

TASK [fedora.linux_system_roles.network : Check if system is ostree] ***********
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12
Tuesday 24 September 2024 15:03:11 -0400 (0:00:00.047) 0:00:10.300 *****
51385 1727204591.89710: entering _queue_task() for managed-node1/stat
51385 1727204591.89904: worker is 1 (out of 1 available)
51385 1727204591.89916: exiting _queue_task() for managed-node1/stat
51385 1727204591.89928: done queuing things up, now waiting for results queue to drain
51385 1727204591.89930: waiting for pending results...
51385 1727204591.90109: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if system is ostree
51385 1727204591.90206: in run() - task 0affcd87-79f5-6b1f-5706-0000000004c1
51385 1727204591.90217: variable 'ansible_search_path' from source: unknown
51385 1727204591.90220: variable 'ansible_search_path' from source: unknown
51385 1727204591.90249: calling self._execute()
51385 1727204591.90310: variable 'ansible_host' from source: host vars for 'managed-node1'
51385 1727204591.90314: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
51385 1727204591.90323: variable 'omit' from source: magic vars
51385 1727204591.90652: variable 'ansible_distribution_major_version' from source: facts
51385 1727204591.90676: Evaluated conditional (ansible_distribution_major_version != '6'): True
51385 1727204591.90841: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
51385 1727204591.91119: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
51385 1727204591.91175: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
51385 1727204591.91217: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
51385 1727204591.91252: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
51385 1727204591.91349: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
51385 1727204591.91383: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
51385 1727204591.91416: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
51385 1727204591.91453: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
51385 1727204591.91535: variable '__network_is_ostree' from source: set_fact
51385 1727204591.91545: Evaluated conditional (not __network_is_ostree is defined): False
51385 1727204591.91548: when evaluation is False, skipping this task
51385 1727204591.91553: _execute() done
51385 1727204591.91567: dumping result to json
51385 1727204591.91571: done dumping result, returning
51385 1727204591.91578: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcd87-79f5-6b1f-5706-0000000004c1]
51385 1727204591.91584: sending task result for task 0affcd87-79f5-6b1f-5706-0000000004c1
51385 1727204591.91883: done sending task result for task 0affcd87-79f5-6b1f-5706-0000000004c1
51385 1727204591.91886: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __network_is_ostree is defined",
    "skip_reason": "Conditional result was False"
}
51385 1727204591.91936: no more pending results, returning what we have
51385 1727204591.91939: results queue empty
51385 1727204591.91941: checking for any_errors_fatal
51385 1727204591.91949: done checking for any_errors_fatal
51385 1727204591.91950: checking for max_fail_percentage
51385 1727204591.91951: done checking for max_fail_percentage
51385 1727204591.91952: checking to see if all hosts have failed and the running result is not ok
51385 1727204591.91953: done checking to see if all hosts have failed
51385 1727204591.91954: getting the remaining hosts for this loop
51385 1727204591.91955: done getting the remaining hosts for this loop
51385 1727204591.91959: getting the next task for host managed-node1
51385 1727204591.91967: done getting next task for host managed-node1
51385 1727204591.91971: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree
51385 1727204591.91975: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
51385 1727204591.91987: getting variables
51385 1727204591.91989: in VariableManager get_vars()
51385 1727204591.92026: Calling all_inventory to load vars for managed-node1
51385 1727204591.92029: Calling groups_inventory to load vars for managed-node1
51385 1727204591.92042: Calling all_plugins_inventory to load vars for managed-node1
51385 1727204591.92052: Calling all_plugins_play to load vars for managed-node1
51385 1727204591.92055: Calling groups_plugins_inventory to load vars for managed-node1
51385 1727204591.92058: Calling groups_plugins_play to load vars for managed-node1
51385 1727204591.92265: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
51385 1727204591.92512: done with get_vars()
51385 1727204591.92523: done getting variables
51385 1727204591.92598: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] ***
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17
Tuesday 24 September 2024 15:03:11 -0400 (0:00:00.029) 0:00:10.330 *****
51385 1727204591.92636: entering _queue_task() for managed-node1/set_fact
51385 1727204591.92882: worker is 1 (out of 1 available)
51385 1727204591.92894: exiting _queue_task() for managed-node1/set_fact
51385 1727204591.92907: done queuing things up, now waiting for results queue to drain
51385 1727204591.92908: waiting for pending results...
51385 1727204591.93086: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree
51385 1727204591.93184: in run() - task 0affcd87-79f5-6b1f-5706-0000000004c2
51385 1727204591.93195: variable 'ansible_search_path' from source: unknown
51385 1727204591.93199: variable 'ansible_search_path' from source: unknown
51385 1727204591.93226: calling self._execute()
51385 1727204591.93290: variable 'ansible_host' from source: host vars for 'managed-node1'
51385 1727204591.93293: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
51385 1727204591.93301: variable 'omit' from source: magic vars
51385 1727204591.93567: variable 'ansible_distribution_major_version' from source: facts
51385 1727204591.93579: Evaluated conditional (ansible_distribution_major_version != '6'): True
51385 1727204591.93698: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
51385 1727204591.94179: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
51385 1727204591.94211: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
51385 1727204591.94237: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
51385 1727204591.94261: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
51385 1727204591.94328: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
51385 1727204591.94346: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
51385 1727204591.94368: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
51385 1727204591.94386: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
51385 1727204591.94445: variable '__network_is_ostree' from source: set_fact
51385 1727204591.94457: Evaluated conditional (not __network_is_ostree is defined): False
51385 1727204591.94463: when evaluation is False, skipping this task
51385 1727204591.94467: _execute() done
51385 1727204591.94470: dumping result to json
51385 1727204591.94472: done dumping result, returning
51385 1727204591.94475: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcd87-79f5-6b1f-5706-0000000004c2]
51385 1727204591.94482: sending task result for task 0affcd87-79f5-6b1f-5706-0000000004c2
51385 1727204591.94561: done sending task result for task 0affcd87-79f5-6b1f-5706-0000000004c2
51385 1727204591.94564: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __network_is_ostree is defined",
    "skip_reason": "Conditional result was False"
}
51385 1727204591.94605: no more pending results, returning what we have
51385 1727204591.94609: results queue empty
51385 1727204591.94610: checking for any_errors_fatal
51385 1727204591.94614: done checking for any_errors_fatal
51385 1727204591.94614: checking for max_fail_percentage
51385 1727204591.94616: done checking for max_fail_percentage
51385 1727204591.94617: checking to see if all hosts have failed and the running result is not ok
51385 1727204591.94618: done checking to see if all hosts have failed
51385 1727204591.94619: getting the remaining hosts for this loop
51385 1727204591.94620: done getting the remaining hosts for this loop
51385 1727204591.94624: getting the next task for host managed-node1
51385 1727204591.94632: done getting next task for host managed-node1
51385 1727204591.94636: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running
51385 1727204591.94639: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False 51385 1727204591.94653: getting variables 51385 1727204591.94655: in VariableManager get_vars() 51385 1727204591.94695: Calling all_inventory to load vars for managed-node1 51385 1727204591.94698: Calling groups_inventory to load vars for managed-node1 51385 1727204591.94700: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204591.94708: Calling all_plugins_play to load vars for managed-node1 51385 1727204591.94710: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204591.94713: Calling groups_plugins_play to load vars for managed-node1 51385 1727204591.95047: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204591.95261: done with get_vars() 51385 1727204591.95272: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 15:03:11 -0400 (0:00:00.027) 0:00:10.357 ***** 51385 1727204591.95377: entering _queue_task() for managed-node1/service_facts 51385 1727204591.95379: Creating lock for service_facts 51385 1727204591.95719: worker is 1 (out of 1 available) 51385 1727204591.95732: exiting _queue_task() for managed-node1/service_facts 51385 1727204591.95747: done queuing things up, now waiting for results queue to drain 51385 1727204591.95749: waiting for pending results... 
51385 1727204591.96036: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which services are running 51385 1727204591.96176: in run() - task 0affcd87-79f5-6b1f-5706-0000000004c4 51385 1727204591.96199: variable 'ansible_search_path' from source: unknown 51385 1727204591.96203: variable 'ansible_search_path' from source: unknown 51385 1727204591.96246: calling self._execute() 51385 1727204591.96335: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204591.96353: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204591.96361: variable 'omit' from source: magic vars 51385 1727204591.96645: variable 'ansible_distribution_major_version' from source: facts 51385 1727204591.96656: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204591.96667: variable 'omit' from source: magic vars 51385 1727204591.96712: variable 'omit' from source: magic vars 51385 1727204591.96739: variable 'omit' from source: magic vars 51385 1727204591.96796: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 51385 1727204591.96841: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 51385 1727204591.96877: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 51385 1727204591.96902: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 51385 1727204591.96919: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 51385 1727204591.96966: variable 'inventory_hostname' from source: host vars for 'managed-node1' 51385 1727204591.96978: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204591.96987: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed-node1' 51385 1727204591.97099: Set connection var ansible_pipelining to False 51385 1727204591.97106: Set connection var ansible_shell_type to sh 51385 1727204591.97120: Set connection var ansible_module_compression to ZIP_DEFLATED 51385 1727204591.97130: Set connection var ansible_timeout to 10 51385 1727204591.97136: Set connection var ansible_connection to ssh 51385 1727204591.97144: Set connection var ansible_shell_executable to /bin/sh 51385 1727204591.97181: variable 'ansible_shell_executable' from source: unknown 51385 1727204591.97190: variable 'ansible_connection' from source: unknown 51385 1727204591.97199: variable 'ansible_module_compression' from source: unknown 51385 1727204591.97205: variable 'ansible_shell_type' from source: unknown 51385 1727204591.97211: variable 'ansible_shell_executable' from source: unknown 51385 1727204591.97216: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204591.97222: variable 'ansible_pipelining' from source: unknown 51385 1727204591.97227: variable 'ansible_timeout' from source: unknown 51385 1727204591.97234: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204591.97453: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 51385 1727204591.97471: variable 'omit' from source: magic vars 51385 1727204591.97482: starting attempt loop 51385 1727204591.97496: running the handler 51385 1727204591.97523: _low_level_execute_command(): starting 51385 1727204591.97538: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 51385 1727204591.98398: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204591.98414: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 51385 1727204591.98427: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204591.98447: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204591.98499: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204591.98514: stderr chunk (state=3): >>>debug2: match not found <<< 51385 1727204591.98527: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204591.98545: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 51385 1727204591.98556: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 51385 1727204591.98570: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 51385 1727204591.98588: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204591.98609: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204591.98630: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204591.98643: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204591.98656: stderr chunk (state=3): >>>debug2: match found <<< 51385 1727204591.98673: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204591.98761: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204591.98779: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204591.98794: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204591.98896: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 
1727204592.00460: stdout chunk (state=3): >>>/root <<< 51385 1727204592.00569: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204592.00622: stderr chunk (state=3): >>><<< 51385 1727204592.00626: stdout chunk (state=3): >>><<< 51385 1727204592.00652: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204592.00666: _low_level_execute_command(): starting 51385 1727204592.00672: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204592.006507-52325-233052180197561 `" && echo ansible-tmp-1727204592.006507-52325-233052180197561="` echo /root/.ansible/tmp/ansible-tmp-1727204592.006507-52325-233052180197561 `" ) && sleep 0' 51385 1727204592.01358: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 
3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204592.01369: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204592.01412: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 51385 1727204592.01418: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204592.01432: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 51385 1727204592.01437: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204592.01449: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204592.01454: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found <<< 51385 1727204592.01470: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204592.01556: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204592.01575: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204592.01653: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204592.03494: stdout chunk (state=3): >>>ansible-tmp-1727204592.006507-52325-233052180197561=/root/.ansible/tmp/ansible-tmp-1727204592.006507-52325-233052180197561 <<< 51385 1727204592.03689: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204592.03716: stderr chunk (state=3): 
>>><<< 51385 1727204592.03720: stdout chunk (state=3): >>><<< 51385 1727204592.04171: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204592.006507-52325-233052180197561=/root/.ansible/tmp/ansible-tmp-1727204592.006507-52325-233052180197561 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204592.04175: variable 'ansible_module_compression' from source: unknown 51385 1727204592.04177: ANSIBALLZ: Using lock for service_facts 51385 1727204592.04179: ANSIBALLZ: Acquiring lock 51385 1727204592.04181: ANSIBALLZ: Lock acquired: 140124831875440 51385 1727204592.04183: ANSIBALLZ: Creating module 51385 1727204592.18784: ANSIBALLZ: Writing module into payload 51385 1727204592.19055: ANSIBALLZ: Writing module 51385 1727204592.19060: ANSIBALLZ: Renaming module 51385 1727204592.19063: ANSIBALLZ: Done creating module 51385 1727204592.19067: variable 'ansible_facts' from source: 
unknown 51385 1727204592.19069: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204592.006507-52325-233052180197561/AnsiballZ_service_facts.py 51385 1727204592.19975: Sending initial data 51385 1727204592.19979: Sent initial data (161 bytes) 51385 1727204592.21153: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204592.21253: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204592.21610: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204592.21783: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204592.23533: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server 
supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 51385 1727204592.23595: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 51385 1727204592.23645: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-51385tpxlmlox/tmpg3lyz5xz /root/.ansible/tmp/ansible-tmp-1727204592.006507-52325-233052180197561/AnsiballZ_service_facts.py <<< 51385 1727204592.23699: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 51385 1727204592.25055: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204592.25313: stderr chunk (state=3): >>><<< 51385 1727204592.25317: stdout chunk (state=3): >>><<< 51385 1727204592.25334: done transferring module to remote 51385 1727204592.25346: _low_level_execute_command(): starting 51385 1727204592.25351: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204592.006507-52325-233052180197561/ /root/.ansible/tmp/ansible-tmp-1727204592.006507-52325-233052180197561/AnsiballZ_service_facts.py && sleep 0' 51385 1727204592.26970: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204592.27009: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204592.27015: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204592.27054: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204592.27095: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204592.27194: stderr chunk (state=3): >>>debug2: match not found <<< 51385 1727204592.27211: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204592.27221: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 51385 1727204592.27223: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 51385 1727204592.27225: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 51385 1727204592.27228: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204592.27229: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204592.27231: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204592.27233: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204592.27235: stderr chunk (state=3): >>>debug2: match found <<< 51385 1727204592.27238: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204592.27270: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204592.27277: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204592.27280: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204592.27412: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204592.29181: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204592.29186: stdout chunk (state=3): >>><<< 51385 1727204592.29191: stderr chunk (state=3): >>><<< 51385 1727204592.29214: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204592.29218: _low_level_execute_command(): starting 51385 1727204592.29220: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204592.006507-52325-233052180197561/AnsiballZ_service_facts.py && sleep 0' 51385 1727204592.31089: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204592.31188: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204592.31340: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204592.31343: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration <<< 51385 
1727204592.31357: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204592.31363: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204592.31387: stderr chunk (state=3): >>>debug2: match found <<< 51385 1727204592.31392: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204592.31470: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204592.31533: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204592.31538: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204592.31630: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204593.60943: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": 
"systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": 
{"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": 
"NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "s<<< 51385 1727204593.60991: stdout chunk (state=3): >>>tate": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"},
"systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped",
"status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", 
"source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name":
"NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": 
"indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"},
"systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": 
{"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 51385 1727204593.62271: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. <<< 51385 1727204593.62338: stderr chunk (state=3): >>><<< 51385 1727204593.62341: stdout chunk (state=3): >>><<< 51385 1727204593.62383: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": 
"stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", 
"source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": 
{"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", 
"state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": 
"systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", 
"status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, 
"debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": 
"sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": 
"static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
51385 1727204593.63010: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204592.006507-52325-233052180197561/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 51385 1727204593.63027: _low_level_execute_command(): starting 51385 1727204593.63036: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204592.006507-52325-233052180197561/ > /dev/null 2>&1 && sleep 0' 51385 1727204593.63696: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204593.63712: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204593.63728: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204593.63747: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204593.63792: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204593.63803: stderr chunk (state=3): >>>debug2: match not found <<< 51385 1727204593.63815: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204593.63829: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 51385 1727204593.63839: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address 
<<< 51385 1727204593.63847: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 51385 1727204593.63856: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204593.63872: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204593.63885: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204593.63897: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204593.63907: stderr chunk (state=3): >>>debug2: match found <<< 51385 1727204593.63918: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204593.63996: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204593.64015: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204593.64031: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204593.64122: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204593.65897: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204593.65982: stderr chunk (state=3): >>><<< 51385 1727204593.65986: stdout chunk (state=3): >>><<< 51385 1727204593.66669: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204593.66672: handler run complete 51385 1727204593.66675: variable 'ansible_facts' from source: unknown 51385 1727204593.66677: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204593.66796: variable 'ansible_facts' from source: unknown 51385 1727204593.66914: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204593.67102: attempt loop complete, returning result 51385 1727204593.67113: _execute() done 51385 1727204593.67120: dumping result to json 51385 1727204593.67181: done dumping result, returning 51385 1727204593.67195: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which services are running [0affcd87-79f5-6b1f-5706-0000000004c4] 51385 1727204593.67206: sending task result for task 0affcd87-79f5-6b1f-5706-0000000004c4 ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 51385 1727204593.67955: no more pending results, returning what we have 51385 1727204593.67958: results queue empty 51385 1727204593.67962: checking for any_errors_fatal 51385 1727204593.67970: done checking for any_errors_fatal 51385 1727204593.67971: checking for max_fail_percentage 51385 1727204593.67973: 
done checking for max_fail_percentage 51385 1727204593.67974: checking to see if all hosts have failed and the running result is not ok 51385 1727204593.67975: done checking to see if all hosts have failed 51385 1727204593.67976: getting the remaining hosts for this loop 51385 1727204593.67978: done getting the remaining hosts for this loop 51385 1727204593.67982: getting the next task for host managed-node1 51385 1727204593.67989: done getting next task for host managed-node1 51385 1727204593.67993: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 51385 1727204593.67997: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51385 1727204593.68008: getting variables 51385 1727204593.68010: in VariableManager get_vars() 51385 1727204593.68052: Calling all_inventory to load vars for managed-node1 51385 1727204593.68055: Calling groups_inventory to load vars for managed-node1 51385 1727204593.68058: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204593.68074: Calling all_plugins_play to load vars for managed-node1 51385 1727204593.68077: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204593.68080: Calling groups_plugins_play to load vars for managed-node1 51385 1727204593.68484: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204593.69085: done sending task result for task 0affcd87-79f5-6b1f-5706-0000000004c4 51385 1727204593.69088: WORKER PROCESS EXITING 51385 1727204593.69348: done with get_vars() 51385 1727204593.69367: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 15:03:13 -0400 (0:00:01.740) 0:00:12.098 ***** 51385 1727204593.69472: entering _queue_task() for managed-node1/package_facts 51385 1727204593.69474: Creating lock for package_facts 51385 1727204593.69761: worker is 1 (out of 1 available) 51385 1727204593.69779: exiting _queue_task() for managed-node1/package_facts 51385 1727204593.69793: done queuing things up, now waiting for results queue to drain 51385 1727204593.69794: waiting for pending results... 
51385 1727204593.70085: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 51385 1727204593.70241: in run() - task 0affcd87-79f5-6b1f-5706-0000000004c5 51385 1727204593.70265: variable 'ansible_search_path' from source: unknown 51385 1727204593.70274: variable 'ansible_search_path' from source: unknown 51385 1727204593.70314: calling self._execute() 51385 1727204593.70398: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204593.70410: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204593.70425: variable 'omit' from source: magic vars 51385 1727204593.70790: variable 'ansible_distribution_major_version' from source: facts 51385 1727204593.70806: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204593.70816: variable 'omit' from source: magic vars 51385 1727204593.70896: variable 'omit' from source: magic vars 51385 1727204593.70932: variable 'omit' from source: magic vars 51385 1727204593.70980: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 51385 1727204593.71023: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 51385 1727204593.71050: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 51385 1727204593.71078: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 51385 1727204593.71095: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 51385 1727204593.71132: variable 'inventory_hostname' from source: host vars for 'managed-node1' 51385 1727204593.71140: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204593.71147: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed-node1' 51385 1727204593.71254: Set connection var ansible_pipelining to False 51385 1727204593.71269: Set connection var ansible_shell_type to sh 51385 1727204593.71288: Set connection var ansible_module_compression to ZIP_DEFLATED 51385 1727204593.71299: Set connection var ansible_timeout to 10 51385 1727204593.71305: Set connection var ansible_connection to ssh 51385 1727204593.71313: Set connection var ansible_shell_executable to /bin/sh 51385 1727204593.71342: variable 'ansible_shell_executable' from source: unknown 51385 1727204593.71349: variable 'ansible_connection' from source: unknown 51385 1727204593.71355: variable 'ansible_module_compression' from source: unknown 51385 1727204593.71366: variable 'ansible_shell_type' from source: unknown 51385 1727204593.71373: variable 'ansible_shell_executable' from source: unknown 51385 1727204593.71380: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204593.71387: variable 'ansible_pipelining' from source: unknown 51385 1727204593.71393: variable 'ansible_timeout' from source: unknown 51385 1727204593.71400: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204593.71599: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 51385 1727204593.71617: variable 'omit' from source: magic vars 51385 1727204593.71626: starting attempt loop 51385 1727204593.71633: running the handler 51385 1727204593.71662: _low_level_execute_command(): starting 51385 1727204593.71681: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 51385 1727204593.72472: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204593.72490: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 51385 1727204593.72504: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204593.72526: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204593.72574: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204593.72587: stderr chunk (state=3): >>>debug2: match not found <<< 51385 1727204593.72600: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204593.72621: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 51385 1727204593.72632: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 51385 1727204593.72642: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 51385 1727204593.72653: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204593.72670: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204593.72686: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204593.72698: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204593.72709: stderr chunk (state=3): >>>debug2: match found <<< 51385 1727204593.72721: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204593.72806: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204593.72829: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204593.72848: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204593.72938: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 
1727204593.74508: stdout chunk (state=3): >>>/root <<< 51385 1727204593.74607: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204593.74696: stderr chunk (state=3): >>><<< 51385 1727204593.74699: stdout chunk (state=3): >>><<< 51385 1727204593.74822: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204593.74826: _low_level_execute_command(): starting 51385 1727204593.74829: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204593.747216-52401-122432941223467 `" && echo ansible-tmp-1727204593.747216-52401-122432941223467="` echo /root/.ansible/tmp/ansible-tmp-1727204593.747216-52401-122432941223467 `" ) && sleep 0' 51385 1727204593.75447: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 
3.2.2 4 Jun 2024 <<< 51385 1727204593.75461: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204593.75489: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204593.75507: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204593.75548: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204593.75560: stderr chunk (state=3): >>>debug2: match not found <<< 51385 1727204593.75578: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204593.75601: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 51385 1727204593.75612: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 51385 1727204593.75622: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 51385 1727204593.75633: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204593.75646: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204593.75661: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204593.75676: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204593.75688: stderr chunk (state=3): >>>debug2: match found <<< 51385 1727204593.75707: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204593.75777: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204593.75806: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204593.75823: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 
<<< 51385 1727204593.75908: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204593.77749: stdout chunk (state=3): >>>ansible-tmp-1727204593.747216-52401-122432941223467=/root/.ansible/tmp/ansible-tmp-1727204593.747216-52401-122432941223467 <<< 51385 1727204593.77864: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204593.77968: stderr chunk (state=3): >>><<< 51385 1727204593.77984: stdout chunk (state=3): >>><<< 51385 1727204593.78277: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204593.747216-52401-122432941223467=/root/.ansible/tmp/ansible-tmp-1727204593.747216-52401-122432941223467 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204593.78281: variable 'ansible_module_compression' from source: unknown 51385 1727204593.78283: ANSIBALLZ: Using lock for package_facts 51385 
1727204593.78285: ANSIBALLZ: Acquiring lock 51385 1727204593.78287: ANSIBALLZ: Lock acquired: 140124833185232 51385 1727204593.78289: ANSIBALLZ: Creating module 51385 1727204594.19632: ANSIBALLZ: Writing module into payload 51385 1727204594.19749: ANSIBALLZ: Writing module 51385 1727204594.19782: ANSIBALLZ: Renaming module 51385 1727204594.19786: ANSIBALLZ: Done creating module 51385 1727204594.19804: variable 'ansible_facts' from source: unknown 51385 1727204594.19926: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204593.747216-52401-122432941223467/AnsiballZ_package_facts.py 51385 1727204594.20044: Sending initial data 51385 1727204594.20048: Sent initial data (161 bytes) 51385 1727204594.20786: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204594.20792: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204594.20839: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204594.20843: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204594.20845: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204594.20908: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 
1727204594.20911: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204594.20913: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204594.20980: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204594.22730: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 51385 1727204594.22780: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 51385 1727204594.22828: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-51385tpxlmlox/tmpyb1dnk0_ /root/.ansible/tmp/ansible-tmp-1727204593.747216-52401-122432941223467/AnsiballZ_package_facts.py <<< 51385 1727204594.22877: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 51385 1727204594.24653: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204594.24784: stderr chunk (state=3): >>><<< 51385 1727204594.24788: stdout chunk (state=3): >>><<< 51385 1727204594.24803: done transferring module to remote 51385 1727204594.24813: _low_level_execute_command(): starting 51385 1727204594.24818: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204593.747216-52401-122432941223467/ 
/root/.ansible/tmp/ansible-tmp-1727204593.747216-52401-122432941223467/AnsiballZ_package_facts.py && sleep 0' 51385 1727204594.25308: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204594.25313: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204594.25342: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204594.25356: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204594.25407: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204594.25420: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204594.25430: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204594.25494: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204594.27234: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204594.27272: stderr chunk (state=3): >>><<< 51385 1727204594.27276: stdout chunk (state=3): >>><<< 51385 1727204594.27290: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204594.27293: _low_level_execute_command(): starting 51385 1727204594.27298: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204593.747216-52401-122432941223467/AnsiballZ_package_facts.py && sleep 0' 51385 1727204594.27788: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204594.27791: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204594.27819: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204594.27831: stderr chunk (state=3): >>>debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204594.27890: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204594.27898: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204594.27908: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204594.27989: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204594.73705: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", 
"version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gli<<< 51385 1727204594.73745: stdout chunk (state=3): >>>bc": [{"name": "glibc", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": 
"2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": 
[{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4"<<< 51385 1727204594.73763: stdout chunk (state=3): >>>, "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x<<< 51385 1727204594.73781: stdout chunk (state=3): >>>86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "47.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", 
"release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "<<< 51385 1727204594.73833: stdout chunk (state=3): >>>libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": 
[{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "rel<<< 51385 1727204594.73837: stdout 
chunk (state=3): >>>ease": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": 
"gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": 
[{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], <<< 51385 1727204594.73857: stdout chunk (state=3): >>>"slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": 
"8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057"<<< 51385 1727204594.73871: stdout chunk (state=3): >>>, "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], 
"lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-p<<< 51385 1727204594.73879: stdout chunk (state=3): >>>ubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", 
"release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": 
"481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{<<< 51385 1727204594.73884: stdout chunk (state=3): >>>"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": 
"461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": 
"perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "<<< 51385 1727204594.73903: stdout chunk (state=3): >>>perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": 
"4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "releas<<< 51385 1727204594.73907: stdout chunk (state=3): >>>e": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": 
[{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": 
"11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "6.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": 
"3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", 
"source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 51385 1727204594.75472: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
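The JSON payload that ends above is the return value of Ansible's `package_facts` module (its `invocation` block shows `module_args: {"manager": ["auto"], "strategy": "first"}`): `ansible_facts.packages` maps each package name to a list of installed instances, each carrying `name`, `version`, `release`, `epoch`, `arch`, and `source` keys. A minimal sketch of consuming that structure once it has been captured as a Python dict; the `nevra` helper name is mine, and the two sample entries are copied from the output above:

```python
def nevra(pkg: dict) -> str:
    """Format one package-fact instance as an RPM-style name[-epoch:]version-release.arch
    string, omitting the epoch when it is null (JSON null -> Python None)."""
    epoch = f"{pkg['epoch']}:" if pkg.get("epoch") is not None else ""
    return f"{pkg['name']}-{epoch}{pkg['version']}-{pkg['release']}.{pkg['arch']}"

# Two entries lifted verbatim from the ansible_facts.packages result above.
packages = {
    "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9",
                 "epoch": 1, "arch": "x86_64", "source": "rpm"}],
    "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9",
              "epoch": None, "arch": "x86_64", "source": "rpm"}],
}

# Each name maps to a *list* because multiple instances (e.g. multilib or
# multiple kernels) of the same package can be installed at once.
for instances in packages.values():
    for inst in instances:
        print(nevra(inst))
# -> openssl-1:3.2.2-6.el9.x86_64
# -> bash-5.1.8-9.el9.x86_64
```

Inside a playbook the same lookup would typically be done with Jinja2, e.g. `{{ ansible_facts.packages['openssl'][0].version }}`, after a `package_facts` task has run.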
<<< 51385 1727204594.75477: stdout chunk (state=3): >>><<< 51385 1727204594.75482: stderr chunk (state=3): >>><<< 51385 1727204594.75526: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": 
"ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 
1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": 
"4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "47.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": 
"34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", 
"release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": 
"authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": 
"3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": 
"perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], 
"perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", 
"release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], 
"perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": 
"dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": 
"1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": 
"3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", 
"release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
51385 1727204594.80978: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204593.747216-52401-122432941223467/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 51385 1727204594.81009: _low_level_execute_command(): starting 51385 1727204594.81013: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204593.747216-52401-122432941223467/ > /dev/null 2>&1 && sleep 0' 51385 1727204594.81718: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204594.81727: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204594.81738: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204594.81761: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204594.81804: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204594.81815: stderr chunk (state=3): >>>debug2: match not found <<< 51385 1727204594.81821: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204594.81834: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 51385 1727204594.81841: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address 
<<< 51385 1727204594.81848: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 51385 1727204594.81857: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204594.81880: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204594.81892: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204594.81899: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204594.81905: stderr chunk (state=3): >>>debug2: match found <<< 51385 1727204594.81915: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204594.81994: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204594.82010: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204594.82013: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204594.82202: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204594.84056: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204594.84060: stdout chunk (state=3): >>><<< 51385 1727204594.84071: stderr chunk (state=3): >>><<< 51385 1727204594.84089: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204594.84094: handler run complete 51385 1727204594.85020: variable 'ansible_facts' from source: unknown 51385 1727204594.85515: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204594.87929: variable 'ansible_facts' from source: unknown 51385 1727204594.88209: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204594.88649: attempt loop complete, returning result 51385 1727204594.88662: _execute() done 51385 1727204594.88665: dumping result to json 51385 1727204594.88792: done dumping result, returning 51385 1727204594.88800: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcd87-79f5-6b1f-5706-0000000004c5] 51385 1727204594.88806: sending task result for task 0affcd87-79f5-6b1f-5706-0000000004c5 51385 1727204594.92928: done sending task result for task 0affcd87-79f5-6b1f-5706-0000000004c5 51385 1727204594.92932: WORKER PROCESS EXITING
ok: [managed-node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
51385 1727204594.93012: no more pending results, returning what we have 51385 1727204594.93014: results queue empty 51385 1727204594.93015: checking for
any_errors_fatal 51385 1727204594.93019: done checking for any_errors_fatal 51385 1727204594.93019: checking for max_fail_percentage 51385 1727204594.93021: done checking for max_fail_percentage 51385 1727204594.93022: checking to see if all hosts have failed and the running result is not ok 51385 1727204594.93022: done checking to see if all hosts have failed 51385 1727204594.93023: getting the remaining hosts for this loop 51385 1727204594.93024: done getting the remaining hosts for this loop 51385 1727204594.93028: getting the next task for host managed-node1 51385 1727204594.93035: done getting next task for host managed-node1 51385 1727204594.93038: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 51385 1727204594.93041: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51385 1727204594.93051: getting variables 51385 1727204594.93052: in VariableManager get_vars() 51385 1727204594.93087: Calling all_inventory to load vars for managed-node1 51385 1727204594.93090: Calling groups_inventory to load vars for managed-node1 51385 1727204594.93093: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204594.93102: Calling all_plugins_play to load vars for managed-node1 51385 1727204594.93104: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204594.93107: Calling groups_plugins_play to load vars for managed-node1 51385 1727204594.95599: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204594.97779: done with get_vars() 51385 1727204594.97836: done getting variables 51385 1727204594.97919: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Print network provider] **************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7
Tuesday 24 September 2024 15:03:14 -0400 (0:00:01.284) 0:00:13.383 *****
51385 1727204594.97955: entering _queue_task() for managed-node1/debug 51385 1727204594.98283: worker is 1 (out of 1 available) 51385 1727204594.98296: exiting _queue_task() for managed-node1/debug 51385 1727204594.98312: done queuing things up, now waiting for results queue to drain 51385 1727204594.98314: waiting for pending results... 
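The `Evaluated conditional (...)` entries in the trace that follows, and the later `when evaluation is False, skipping this task`, show each `when:` clause being checked in order against the host's variables, with the first false clause short-circuiting the task into a skip and being reported back as `false_condition`. An illustrative-only sketch of that control flow; the function name and the use of `eval()` are our stand-ins (real Ansible templates each clause with Jinja2), not Ansible's internal API:

```python
# Mimic Ansible's when-clause handling: evaluate each clause in order
# against host variables; the first False clause causes a skip and is
# reported as "false_condition" in the skip result.
def evaluate_when(conditions, hostvars):
    for cond in conditions:
        # eval() against a restricted namespace is only a stand-in for
        # Jinja2 expression templating.
        if not eval(cond, {"__builtins__": {}}, dict(hostvars)):
            return False, cond
    return True, None

hostvars = {"ansible_distribution_major_version": "9", "network_state": {}}
ok, false_condition = evaluate_when(
    ["ansible_distribution_major_version != '6'", "network_state != {}"],
    hostvars,
)
print(ok, false_condition)  # → False network_state != {}
```

This matches the trace: the version check evaluates True, `network_state != {}` evaluates False, and the task is skipped with that clause recorded as the false condition.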
51385 1727204594.98631: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Print network provider 51385 1727204594.98849: in run() - task 0affcd87-79f5-6b1f-5706-000000000017 51385 1727204594.98888: variable 'ansible_search_path' from source: unknown 51385 1727204594.98893: variable 'ansible_search_path' from source: unknown 51385 1727204594.98935: calling self._execute() 51385 1727204594.99021: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204594.99027: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204594.99035: variable 'omit' from source: magic vars 51385 1727204594.99377: variable 'ansible_distribution_major_version' from source: facts 51385 1727204594.99396: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204594.99411: variable 'omit' from source: magic vars 51385 1727204594.99465: variable 'omit' from source: magic vars 51385 1727204594.99570: variable 'network_provider' from source: set_fact 51385 1727204594.99595: variable 'omit' from source: magic vars 51385 1727204594.99642: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 51385 1727204594.99693: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 51385 1727204594.99723: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 51385 1727204594.99749: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 51385 1727204594.99770: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 51385 1727204594.99806: variable 'inventory_hostname' from source: host vars for 'managed-node1' 51385 1727204594.99813: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 
1727204594.99820: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204594.99922: Set connection var ansible_pipelining to False 51385 1727204594.99931: Set connection var ansible_shell_type to sh 51385 1727204594.99944: Set connection var ansible_module_compression to ZIP_DEFLATED 51385 1727204594.99956: Set connection var ansible_timeout to 10 51385 1727204594.99968: Set connection var ansible_connection to ssh 51385 1727204594.99980: Set connection var ansible_shell_executable to /bin/sh 51385 1727204595.00008: variable 'ansible_shell_executable' from source: unknown 51385 1727204595.00015: variable 'ansible_connection' from source: unknown 51385 1727204595.00021: variable 'ansible_module_compression' from source: unknown 51385 1727204595.00027: variable 'ansible_shell_type' from source: unknown 51385 1727204595.00032: variable 'ansible_shell_executable' from source: unknown 51385 1727204595.00039: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204595.00046: variable 'ansible_pipelining' from source: unknown 51385 1727204595.00052: variable 'ansible_timeout' from source: unknown 51385 1727204595.00062: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204595.00207: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 51385 1727204595.00225: variable 'omit' from source: magic vars 51385 1727204595.00235: starting attempt loop 51385 1727204595.00243: running the handler 51385 1727204595.00297: handler run complete 51385 1727204595.00317: attempt loop complete, returning result 51385 1727204595.00325: _execute() done 51385 1727204595.00332: dumping result to json 51385 1727204595.00340: done dumping result, returning 
51385 1727204595.00353: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Print network provider [0affcd87-79f5-6b1f-5706-000000000017] 51385 1727204595.00370: sending task result for task 0affcd87-79f5-6b1f-5706-000000000017
ok: [managed-node1] => {}

MSG:

Using network provider: nm

51385 1727204595.00536: no more pending results, returning what we have 51385 1727204595.00543: results queue empty 51385 1727204595.00544: checking for any_errors_fatal 51385 1727204595.00555: done checking for any_errors_fatal 51385 1727204595.00556: checking for max_fail_percentage 51385 1727204595.00557: done checking for max_fail_percentage 51385 1727204595.00558: checking to see if all hosts have failed and the running result is not ok 51385 1727204595.00559: done checking to see if all hosts have failed 51385 1727204595.00560: getting the remaining hosts for this loop 51385 1727204595.00562: done getting the remaining hosts for this loop 51385 1727204595.00567: getting the next task for host managed-node1 51385 1727204595.00574: done getting next task for host managed-node1 51385 1727204595.00578: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 51385 1727204595.00581: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False 51385 1727204595.00597: getting variables 51385 1727204595.00599: in VariableManager get_vars() 51385 1727204595.00639: Calling all_inventory to load vars for managed-node1 51385 1727204595.00642: Calling groups_inventory to load vars for managed-node1 51385 1727204595.00644: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204595.00655: Calling all_plugins_play to load vars for managed-node1 51385 1727204595.00657: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204595.00660: Calling groups_plugins_play to load vars for managed-node1 51385 1727204595.01193: done sending task result for task 0affcd87-79f5-6b1f-5706-000000000017 51385 1727204595.01196: WORKER PROCESS EXITING 51385 1727204595.01676: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204595.02621: done with get_vars() 51385 1727204595.02645: done getting variables 51385 1727204595.02694: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] ***
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11
Tuesday 24 September 2024 15:03:15 -0400 (0:00:00.047) 0:00:13.431 *****
51385 1727204595.02718: entering _queue_task() for managed-node1/fail 51385 1727204595.02958: worker is 1 (out of 1 available) 51385 1727204595.02978: exiting _queue_task() for managed-node1/fail 51385 1727204595.02989: done queuing things up, now waiting for results queue to drain 51385 1727204595.02991: waiting for pending results... 
51385 1727204595.03173: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 51385 1727204595.03266: in run() - task 0affcd87-79f5-6b1f-5706-000000000018 51385 1727204595.03276: variable 'ansible_search_path' from source: unknown 51385 1727204595.03283: variable 'ansible_search_path' from source: unknown 51385 1727204595.03320: calling self._execute() 51385 1727204595.03401: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204595.03410: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204595.03418: variable 'omit' from source: magic vars 51385 1727204595.03702: variable 'ansible_distribution_major_version' from source: facts 51385 1727204595.03715: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204595.03801: variable 'network_state' from source: role '' defaults 51385 1727204595.03809: Evaluated conditional (network_state != {}): False 51385 1727204595.03812: when evaluation is False, skipping this task 51385 1727204595.03816: _execute() done 51385 1727204595.03820: dumping result to json 51385 1727204595.03822: done dumping result, returning 51385 1727204595.03828: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcd87-79f5-6b1f-5706-000000000018] 51385 1727204595.03840: sending task result for task 0affcd87-79f5-6b1f-5706-000000000018 51385 1727204595.03928: done sending task result for task 0affcd87-79f5-6b1f-5706-000000000018 51385 1727204595.03933: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
51385 1727204595.03988: no more pending results,
returning what we have 51385 1727204595.03992: results queue empty 51385 1727204595.03993: checking for any_errors_fatal 51385 1727204595.03999: done checking for any_errors_fatal 51385 1727204595.04000: checking for max_fail_percentage 51385 1727204595.04002: done checking for max_fail_percentage 51385 1727204595.04002: checking to see if all hosts have failed and the running result is not ok 51385 1727204595.04003: done checking to see if all hosts have failed 51385 1727204595.04004: getting the remaining hosts for this loop 51385 1727204595.04006: done getting the remaining hosts for this loop 51385 1727204595.04009: getting the next task for host managed-node1 51385 1727204595.04016: done getting next task for host managed-node1 51385 1727204595.04019: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 51385 1727204595.04023: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51385 1727204595.04039: getting variables 51385 1727204595.04041: in VariableManager get_vars() 51385 1727204595.04092: Calling all_inventory to load vars for managed-node1 51385 1727204595.04094: Calling groups_inventory to load vars for managed-node1 51385 1727204595.04096: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204595.04105: Calling all_plugins_play to load vars for managed-node1 51385 1727204595.04108: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204595.04110: Calling groups_plugins_play to load vars for managed-node1 51385 1727204595.05007: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204595.05955: done with get_vars() 51385 1727204595.05975: done getting variables 51385 1727204595.06022: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 15:03:15 -0400 (0:00:00.033) 0:00:13.464 ***** 51385 1727204595.06048: entering _queue_task() for managed-node1/fail 51385 1727204595.06272: worker is 1 (out of 1 available) 51385 1727204595.06286: exiting _queue_task() for managed-node1/fail 51385 1727204595.06299: done queuing things up, now waiting for results queue to drain 51385 1727204595.06301: waiting for pending results... 
51385 1727204595.06484: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 51385 1727204595.06573: in run() - task 0affcd87-79f5-6b1f-5706-000000000019 51385 1727204595.06585: variable 'ansible_search_path' from source: unknown 51385 1727204595.06588: variable 'ansible_search_path' from source: unknown 51385 1727204595.06618: calling self._execute() 51385 1727204595.06691: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204595.06694: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204595.06703: variable 'omit' from source: magic vars 51385 1727204595.06981: variable 'ansible_distribution_major_version' from source: facts 51385 1727204595.06992: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204595.07078: variable 'network_state' from source: role '' defaults 51385 1727204595.07085: Evaluated conditional (network_state != {}): False 51385 1727204595.07089: when evaluation is False, skipping this task 51385 1727204595.07098: _execute() done 51385 1727204595.07101: dumping result to json 51385 1727204595.07105: done dumping result, returning 51385 1727204595.07112: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affcd87-79f5-6b1f-5706-000000000019] 51385 1727204595.07118: sending task result for task 0affcd87-79f5-6b1f-5706-000000000019 51385 1727204595.07206: done sending task result for task 0affcd87-79f5-6b1f-5706-000000000019 51385 1727204595.07208: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 51385 1727204595.07252: no more pending results, returning what we have 51385 
1727204595.07257: results queue empty 51385 1727204595.07258: checking for any_errors_fatal 51385 1727204595.07268: done checking for any_errors_fatal 51385 1727204595.07268: checking for max_fail_percentage 51385 1727204595.07270: done checking for max_fail_percentage 51385 1727204595.07271: checking to see if all hosts have failed and the running result is not ok 51385 1727204595.07272: done checking to see if all hosts have failed 51385 1727204595.07273: getting the remaining hosts for this loop 51385 1727204595.07274: done getting the remaining hosts for this loop 51385 1727204595.07278: getting the next task for host managed-node1 51385 1727204595.07284: done getting next task for host managed-node1 51385 1727204595.07288: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 51385 1727204595.07291: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51385 1727204595.07307: getting variables 51385 1727204595.07309: in VariableManager get_vars() 51385 1727204595.07351: Calling all_inventory to load vars for managed-node1 51385 1727204595.07353: Calling groups_inventory to load vars for managed-node1 51385 1727204595.07355: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204595.07369: Calling all_plugins_play to load vars for managed-node1 51385 1727204595.07372: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204595.07374: Calling groups_plugins_play to load vars for managed-node1 51385 1727204595.08190: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204595.09240: done with get_vars() 51385 1727204595.09261: done getting variables 51385 1727204595.09309: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 15:03:15 -0400 (0:00:00.032) 0:00:13.497 ***** 51385 1727204595.09334: entering _queue_task() for managed-node1/fail 51385 1727204595.09576: worker is 1 (out of 1 available) 51385 1727204595.09588: exiting _queue_task() for managed-node1/fail 51385 1727204595.09600: done queuing things up, now waiting for results queue to drain 51385 1727204595.09601: waiting for pending results... 
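
[Annotation] The EL10 teaming guard queued above (tasks/main.yml:25) skips because the version conditional evaluates False. A hypothetical sketch consistent with the conditional shown in the log (the `fail` message is assumed):

```yaml
# Hypothetical sketch based on the conditional the log evaluates.
- name: Abort applying teaming configuration if the system version of the
    managed host is EL10 or later
  ansible.builtin.fail:
    msg: Team interfaces are not supported on EL10 or later.  # assumed wording
  when: ansible_distribution_major_version | int > 9  # False on this host, so skipped
```
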
51385 1727204595.09784: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 51385 1727204595.09877: in run() - task 0affcd87-79f5-6b1f-5706-00000000001a 51385 1727204595.09887: variable 'ansible_search_path' from source: unknown 51385 1727204595.09890: variable 'ansible_search_path' from source: unknown 51385 1727204595.09925: calling self._execute() 51385 1727204595.09993: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204595.09997: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204595.10006: variable 'omit' from source: magic vars 51385 1727204595.10291: variable 'ansible_distribution_major_version' from source: facts 51385 1727204595.10301: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204595.10426: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 51385 1727204595.12051: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 51385 1727204595.12108: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 51385 1727204595.12136: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 51385 1727204595.12162: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 51385 1727204595.12182: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 51385 1727204595.12242: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 51385 1727204595.12266: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 51385 1727204595.12282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 51385 1727204595.12310: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 51385 1727204595.12327: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 51385 1727204595.12398: variable 'ansible_distribution_major_version' from source: facts 51385 1727204595.12413: Evaluated conditional (ansible_distribution_major_version | int > 9): False 51385 1727204595.12416: when evaluation is False, skipping this task 51385 1727204595.12419: _execute() done 51385 1727204595.12423: dumping result to json 51385 1727204595.12425: done dumping result, returning 51385 1727204595.12433: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcd87-79f5-6b1f-5706-00000000001a] 51385 1727204595.12444: sending task result for task 0affcd87-79f5-6b1f-5706-00000000001a 51385 1727204595.12533: done sending task result for task 0affcd87-79f5-6b1f-5706-00000000001a 51385 1727204595.12536: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int > 9", "skip_reason": "Conditional result was False" } 51385 1727204595.12593: no more pending results, returning what we have 51385 1727204595.12597: 
results queue empty 51385 1727204595.12598: checking for any_errors_fatal 51385 1727204595.12603: done checking for any_errors_fatal 51385 1727204595.12604: checking for max_fail_percentage 51385 1727204595.12605: done checking for max_fail_percentage 51385 1727204595.12606: checking to see if all hosts have failed and the running result is not ok 51385 1727204595.12607: done checking to see if all hosts have failed 51385 1727204595.12608: getting the remaining hosts for this loop 51385 1727204595.12610: done getting the remaining hosts for this loop 51385 1727204595.12613: getting the next task for host managed-node1 51385 1727204595.12624: done getting next task for host managed-node1 51385 1727204595.12628: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 51385 1727204595.12631: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51385 1727204595.12651: getting variables 51385 1727204595.12653: in VariableManager get_vars() 51385 1727204595.12697: Calling all_inventory to load vars for managed-node1 51385 1727204595.12700: Calling groups_inventory to load vars for managed-node1 51385 1727204595.12702: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204595.12711: Calling all_plugins_play to load vars for managed-node1 51385 1727204595.12713: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204595.12716: Calling groups_plugins_play to load vars for managed-node1 51385 1727204595.13574: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204595.14521: done with get_vars() 51385 1727204595.14539: done getting variables 51385 1727204595.14621: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 15:03:15 -0400 (0:00:00.053) 0:00:13.550 ***** 51385 1727204595.14644: entering _queue_task() for managed-node1/dnf 51385 1727204595.14885: worker is 1 (out of 1 available) 51385 1727204595.14898: exiting _queue_task() for managed-node1/dnf 51385 1727204595.14910: done queuing things up, now waiting for results queue to drain 51385 1727204595.14911: waiting for pending results... 
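
[Annotation] The DNF task queued above (tasks/main.yml:36) is gated on two conditionals the log evaluates: the distribution check (True) and the wireless/team check (False, which is why it skips). The variable names below are taken verbatim from the log; the module arguments are assumptions, since the log never reaches module execution:

```yaml
# Hypothetical sketch; 'when' expressions are from the log, dnf args are assumed.
- name: Check if updates for network packages are available through the DNF
    package manager due to wireless or team interfaces
  ansible.builtin.dnf:
    name: NetworkManager  # assumed package list
    state: latest
  check_mode: true  # assumption, matching "check if updates are available"
  when:
    - ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7
    - __network_wireless_connections_defined or __network_team_connections_defined
```

Note that `network_connections` on this run defines only plain and VLAN interfaces (`interface`, `vlan_interface` play vars), so neither wireless nor team connections are defined and the task skips.
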
51385 1727204595.15090: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 51385 1727204595.15181: in run() - task 0affcd87-79f5-6b1f-5706-00000000001b 51385 1727204595.15192: variable 'ansible_search_path' from source: unknown 51385 1727204595.15196: variable 'ansible_search_path' from source: unknown 51385 1727204595.15227: calling self._execute() 51385 1727204595.15295: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204595.15298: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204595.15307: variable 'omit' from source: magic vars 51385 1727204595.15577: variable 'ansible_distribution_major_version' from source: facts 51385 1727204595.15587: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204595.15725: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 51385 1727204595.17354: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 51385 1727204595.17412: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 51385 1727204595.17440: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 51385 1727204595.17463: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 51385 1727204595.17491: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 51385 1727204595.17555: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 51385 1727204595.17580: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 51385 1727204595.17597: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 51385 1727204595.17627: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 51385 1727204595.17641: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 51385 1727204595.17725: variable 'ansible_distribution' from source: facts 51385 1727204595.17731: variable 'ansible_distribution_major_version' from source: facts 51385 1727204595.17747: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 51385 1727204595.17829: variable '__network_wireless_connections_defined' from source: role '' defaults 51385 1727204595.17922: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 51385 1727204595.17941: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 51385 1727204595.17968: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 51385 1727204595.17996: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 51385 1727204595.18006: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 51385 1727204595.18034: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 51385 1727204595.18055: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 51385 1727204595.18077: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 51385 1727204595.18103: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 51385 1727204595.18113: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 51385 1727204595.18139: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 51385 1727204595.18157: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 51385 
1727204595.18181: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 51385 1727204595.18205: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 51385 1727204595.18215: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 51385 1727204595.18321: variable 'network_connections' from source: task vars 51385 1727204595.18330: variable 'interface' from source: play vars 51385 1727204595.18387: variable 'interface' from source: play vars 51385 1727204595.18398: variable 'vlan_interface' from source: play vars 51385 1727204595.18441: variable 'vlan_interface' from source: play vars 51385 1727204595.18447: variable 'interface' from source: play vars 51385 1727204595.18495: variable 'interface' from source: play vars 51385 1727204595.18544: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 51385 1727204595.18661: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 51385 1727204595.18695: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 51385 1727204595.18988: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 51385 1727204595.19009: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 51385 1727204595.19045: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, 
class_only=False) 51385 1727204595.19065: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 51385 1727204595.19084: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 51385 1727204595.19101: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 51385 1727204595.19152: variable '__network_team_connections_defined' from source: role '' defaults 51385 1727204595.19311: variable 'network_connections' from source: task vars 51385 1727204595.19314: variable 'interface' from source: play vars 51385 1727204595.19362: variable 'interface' from source: play vars 51385 1727204595.19372: variable 'vlan_interface' from source: play vars 51385 1727204595.19417: variable 'vlan_interface' from source: play vars 51385 1727204595.19423: variable 'interface' from source: play vars 51385 1727204595.19469: variable 'interface' from source: play vars 51385 1727204595.19500: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 51385 1727204595.19503: when evaluation is False, skipping this task 51385 1727204595.19506: _execute() done 51385 1727204595.19509: dumping result to json 51385 1727204595.19511: done dumping result, returning 51385 1727204595.19519: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcd87-79f5-6b1f-5706-00000000001b] 51385 1727204595.19524: sending task result for task 0affcd87-79f5-6b1f-5706-00000000001b 51385 1727204595.19623: done 
sending task result for task 0affcd87-79f5-6b1f-5706-00000000001b 51385 1727204595.19625: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 51385 1727204595.19681: no more pending results, returning what we have 51385 1727204595.19685: results queue empty 51385 1727204595.19686: checking for any_errors_fatal 51385 1727204595.19697: done checking for any_errors_fatal 51385 1727204595.19698: checking for max_fail_percentage 51385 1727204595.19700: done checking for max_fail_percentage 51385 1727204595.19701: checking to see if all hosts have failed and the running result is not ok 51385 1727204595.19702: done checking to see if all hosts have failed 51385 1727204595.19703: getting the remaining hosts for this loop 51385 1727204595.19705: done getting the remaining hosts for this loop 51385 1727204595.19708: getting the next task for host managed-node1 51385 1727204595.19715: done getting next task for host managed-node1 51385 1727204595.19719: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 51385 1727204595.19722: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51385 1727204595.19736: getting variables 51385 1727204595.19737: in VariableManager get_vars() 51385 1727204595.19786: Calling all_inventory to load vars for managed-node1 51385 1727204595.19789: Calling groups_inventory to load vars for managed-node1 51385 1727204595.19792: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204595.19804: Calling all_plugins_play to load vars for managed-node1 51385 1727204595.19807: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204595.19810: Calling groups_plugins_play to load vars for managed-node1 51385 1727204595.20777: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204595.21707: done with get_vars() 51385 1727204595.21724: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 51385 1727204595.21787: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 15:03:15 -0400 (0:00:00.071) 0:00:13.622 ***** 51385 1727204595.21811: entering _queue_task() for managed-node1/yum 51385 1727204595.21812: Creating lock for yum 51385 1727204595.22054: worker is 1 (out of 1 available) 51385 1727204595.22072: exiting _queue_task() for managed-node1/yum 51385 1727204595.22085: done queuing things up, now waiting for results queue to drain 51385 1727204595.22086: waiting for pending results... 
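
[Annotation] The YUM variant queued above (tasks/main.yml:48) mirrors the DNF task for EL7-era hosts; the log shows `ansible.builtin.yum` being redirected to `ansible.builtin.dnf` by the controller. A hypothetical sketch (condition verbatim from the log, module args assumed):

```yaml
# Hypothetical sketch mirroring the DNF variant above.
- name: Check if updates for network packages are available through the YUM
    package manager due to wireless or team interfaces
  ansible.builtin.yum:  # redirected to ansible.builtin.dnf on this controller
    name: NetworkManager  # assumed
    state: latest
  check_mode: true  # assumption
  when: ansible_distribution_major_version | int < 8  # False here, so skipped
```
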
51385 1727204595.22267: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 51385 1727204595.22353: in run() - task 0affcd87-79f5-6b1f-5706-00000000001c 51385 1727204595.22367: variable 'ansible_search_path' from source: unknown 51385 1727204595.22370: variable 'ansible_search_path' from source: unknown 51385 1727204595.22404: calling self._execute() 51385 1727204595.22472: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204595.22476: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204595.22485: variable 'omit' from source: magic vars 51385 1727204595.22766: variable 'ansible_distribution_major_version' from source: facts 51385 1727204595.22774: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204595.22898: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 51385 1727204595.24615: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 51385 1727204595.24699: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 51385 1727204595.24741: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 51385 1727204595.24784: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 51385 1727204595.24816: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 51385 1727204595.24904: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 51385 1727204595.24937: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 51385 1727204595.24979: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 51385 1727204595.25027: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 51385 1727204595.25046: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 51385 1727204595.25158: variable 'ansible_distribution_major_version' from source: facts 51385 1727204595.25183: Evaluated conditional (ansible_distribution_major_version | int < 8): False 51385 1727204595.25192: when evaluation is False, skipping this task 51385 1727204595.25203: _execute() done 51385 1727204595.25210: dumping result to json 51385 1727204595.25217: done dumping result, returning 51385 1727204595.25228: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affcd87-79f5-6b1f-5706-00000000001c] 51385 1727204595.25238: sending task result for task 0affcd87-79f5-6b1f-5706-00000000001c skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 51385 1727204595.25395: no more pending results, returning what we have 51385 1727204595.25399: results queue empty 51385 1727204595.25400: checking for any_errors_fatal 51385 1727204595.25407: done 
checking for any_errors_fatal 51385 1727204595.25408: checking for max_fail_percentage 51385 1727204595.25410: done checking for max_fail_percentage 51385 1727204595.25411: checking to see if all hosts have failed and the running result is not ok 51385 1727204595.25412: done checking to see if all hosts have failed 51385 1727204595.25413: getting the remaining hosts for this loop 51385 1727204595.25415: done getting the remaining hosts for this loop 51385 1727204595.25418: getting the next task for host managed-node1 51385 1727204595.25425: done getting next task for host managed-node1 51385 1727204595.25429: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 51385 1727204595.25432: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51385 1727204595.25450: getting variables 51385 1727204595.25452: in VariableManager get_vars() 51385 1727204595.25495: Calling all_inventory to load vars for managed-node1 51385 1727204595.25497: Calling groups_inventory to load vars for managed-node1 51385 1727204595.25499: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204595.25510: Calling all_plugins_play to load vars for managed-node1 51385 1727204595.25513: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204595.25516: Calling groups_plugins_play to load vars for managed-node1 51385 1727204595.26035: done sending task result for task 0affcd87-79f5-6b1f-5706-00000000001c 51385 1727204595.26039: WORKER PROCESS EXITING 51385 1727204595.26745: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204595.27699: done with get_vars() 51385 1727204595.27722: done getting variables 51385 1727204595.27777: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 15:03:15 -0400 (0:00:00.059) 0:00:13.681 ***** 51385 1727204595.27804: entering _queue_task() for managed-node1/fail 51385 1727204595.28086: worker is 1 (out of 1 available) 51385 1727204595.28099: exiting _queue_task() for managed-node1/fail 51385 1727204595.28110: done queuing things up, now waiting for results queue to drain 51385 1727204595.28112: waiting for pending results... 
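The YUM check task above was skipped because its `when` conditional, `ansible_distribution_major_version | int < 8`, rendered False. A minimal Python sketch of that evaluation (an illustrative reconstruction, not Ansible's actual Jinja2 machinery; the literal fact values below are assumed, since the log only tells us the version is neither 6 nor below 8):

```python
def yum_check_should_run(distribution_major_version: str, threshold: int = 8) -> bool:
    """Replicate `ansible_distribution_major_version | int < 8`:
    the `int` filter casts the string fact, then the comparison applies."""
    return int(distribution_major_version) < threshold

# A modern EL node (assumed value "9") makes the conditional False,
# matching the log's skip_reason "Conditional result was False".
print(yum_check_should_run("9"))  # False -> task skipped
print(yum_check_should_run("7"))  # True  -> task would run
```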
51385 1727204595.28405: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 51385 1727204595.28553: in run() - task 0affcd87-79f5-6b1f-5706-00000000001d 51385 1727204595.28582: variable 'ansible_search_path' from source: unknown 51385 1727204595.28590: variable 'ansible_search_path' from source: unknown 51385 1727204595.28627: calling self._execute() 51385 1727204595.28721: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204595.28733: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204595.28747: variable 'omit' from source: magic vars 51385 1727204595.29123: variable 'ansible_distribution_major_version' from source: facts 51385 1727204595.29141: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204595.29270: variable '__network_wireless_connections_defined' from source: role '' defaults 51385 1727204595.29469: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 51385 1727204595.32246: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 51385 1727204595.32322: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 51385 1727204595.32372: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 51385 1727204595.32424: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 51385 1727204595.32455: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 51385 1727204595.32539: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 51385 1727204595.32585: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 51385 1727204595.32616: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 51385 1727204595.32665: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 51385 1727204595.32691: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 51385 1727204595.32740: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 51385 1727204595.32773: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 51385 1727204595.32806: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 51385 1727204595.32850: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 51385 1727204595.32871: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 51385 1727204595.32913: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 51385 1727204595.32935: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 51385 1727204595.32957: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 51385 1727204595.32999: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 51385 1727204595.33020: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 51385 1727204595.33197: variable 'network_connections' from source: task vars 51385 1727204595.33215: variable 'interface' from source: play vars 51385 1727204595.33309: variable 'interface' from source: play vars 51385 1727204595.33327: variable 'vlan_interface' from source: play vars 51385 1727204595.33404: variable 'vlan_interface' from source: play vars 51385 1727204595.33417: variable 'interface' from source: play vars 51385 1727204595.33486: variable 'interface' from source: play vars 51385 1727204595.33571: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 51385 1727204595.33754: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 51385 
1727204595.33806: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 51385 1727204595.33842: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 51385 1727204595.33883: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 51385 1727204595.33932: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 51385 1727204595.33979: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 51385 1727204595.34015: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 51385 1727204595.34048: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 51385 1727204595.34126: variable '__network_team_connections_defined' from source: role '' defaults 51385 1727204595.34387: variable 'network_connections' from source: task vars 51385 1727204595.34397: variable 'interface' from source: play vars 51385 1727204595.34469: variable 'interface' from source: play vars 51385 1727204595.34484: variable 'vlan_interface' from source: play vars 51385 1727204595.34548: variable 'vlan_interface' from source: play vars 51385 1727204595.34562: variable 'interface' from source: play vars 51385 1727204595.34626: variable 'interface' from source: play vars 51385 1727204595.34677: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 51385 1727204595.34693: when 
evaluation is False, skipping this task 51385 1727204595.34700: _execute() done 51385 1727204595.34707: dumping result to json 51385 1727204595.34715: done dumping result, returning 51385 1727204595.34726: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcd87-79f5-6b1f-5706-00000000001d] 51385 1727204595.34738: sending task result for task 0affcd87-79f5-6b1f-5706-00000000001d skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 51385 1727204595.34912: no more pending results, returning what we have 51385 1727204595.34917: results queue empty 51385 1727204595.34918: checking for any_errors_fatal 51385 1727204595.34925: done checking for any_errors_fatal 51385 1727204595.34926: checking for max_fail_percentage 51385 1727204595.34928: done checking for max_fail_percentage 51385 1727204595.34929: checking to see if all hosts have failed and the running result is not ok 51385 1727204595.34930: done checking to see if all hosts have failed 51385 1727204595.34931: getting the remaining hosts for this loop 51385 1727204595.34933: done getting the remaining hosts for this loop 51385 1727204595.34937: getting the next task for host managed-node1 51385 1727204595.34945: done getting next task for host managed-node1 51385 1727204595.34949: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 51385 1727204595.34953: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 51385 1727204595.34974: getting variables 51385 1727204595.34976: in VariableManager get_vars() 51385 1727204595.35020: Calling all_inventory to load vars for managed-node1 51385 1727204595.35022: Calling groups_inventory to load vars for managed-node1 51385 1727204595.35025: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204595.35036: Calling all_plugins_play to load vars for managed-node1 51385 1727204595.35039: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204595.35042: Calling groups_plugins_play to load vars for managed-node1 51385 1727204595.36287: done sending task result for task 0affcd87-79f5-6b1f-5706-00000000001d 51385 1727204595.36291: WORKER PROCESS EXITING 51385 1727204595.36910: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204595.38648: done with get_vars() 51385 1727204595.38683: done getting variables 51385 1727204595.38748: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 15:03:15 -0400 (0:00:00.109) 0:00:13.791 ***** 51385 1727204595.38788: entering _queue_task() for managed-node1/package 51385 1727204595.39117: worker is 1 (out of 1 available) 51385 1727204595.39130: exiting _queue_task() for managed-node1/package 51385 1727204595.39142: done queuing 
things up, now waiting for results queue to drain 51385 1727204595.39143: waiting for pending results... 51385 1727204595.39467: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install packages 51385 1727204595.39619: in run() - task 0affcd87-79f5-6b1f-5706-00000000001e 51385 1727204595.39641: variable 'ansible_search_path' from source: unknown 51385 1727204595.39650: variable 'ansible_search_path' from source: unknown 51385 1727204595.39705: calling self._execute() 51385 1727204595.39797: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204595.39813: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204595.39829: variable 'omit' from source: magic vars 51385 1727204595.40228: variable 'ansible_distribution_major_version' from source: facts 51385 1727204595.40251: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204595.40467: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 51385 1727204595.40744: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 51385 1727204595.40800: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 51385 1727204595.40839: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 51385 1727204595.40883: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 51385 1727204595.41001: variable 'network_packages' from source: role '' defaults 51385 1727204595.41118: variable '__network_provider_setup' from source: role '' defaults 51385 1727204595.41133: variable '__network_service_name_default_nm' from source: role '' defaults 51385 1727204595.41211: variable '__network_service_name_default_nm' from source: role '' defaults 51385 1727204595.41228: variable 
'__network_packages_default_nm' from source: role '' defaults 51385 1727204595.41297: variable '__network_packages_default_nm' from source: role '' defaults 51385 1727204595.41497: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 51385 1727204595.43704: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 51385 1727204595.43782: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 51385 1727204595.43838: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 51385 1727204595.43880: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 51385 1727204595.43913: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 51385 1727204595.44002: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 51385 1727204595.44036: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 51385 1727204595.44074: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 51385 1727204595.44119: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 51385 1727204595.44139: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 51385 1727204595.44195: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 51385 1727204595.44221: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 51385 1727204595.44249: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 51385 1727204595.44301: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 51385 1727204595.44318: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 51385 1727204595.44566: variable '__network_packages_default_gobject_packages' from source: role '' defaults 51385 1727204595.44691: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 51385 1727204595.44721: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 51385 1727204595.44747: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 51385 1727204595.44794: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 51385 1727204595.44816: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 51385 1727204595.44918: variable 'ansible_python' from source: facts 51385 1727204595.44947: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 51385 1727204595.45040: variable '__network_wpa_supplicant_required' from source: role '' defaults 51385 1727204595.45128: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 51385 1727204595.45268: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 51385 1727204595.45297: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 51385 1727204595.45324: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 51385 1727204595.45374: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 51385 1727204595.45392: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 51385 
1727204595.45438: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 51385 1727204595.45482: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 51385 1727204595.45509: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 51385 1727204595.45552: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 51385 1727204595.45582: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 51385 1727204595.45731: variable 'network_connections' from source: task vars 51385 1727204595.45742: variable 'interface' from source: play vars 51385 1727204595.45848: variable 'interface' from source: play vars 51385 1727204595.45872: variable 'vlan_interface' from source: play vars 51385 1727204595.45977: variable 'vlan_interface' from source: play vars 51385 1727204595.45992: variable 'interface' from source: play vars 51385 1727204595.46091: variable 'interface' from source: play vars 51385 1727204595.46163: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 51385 1727204595.46194: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 51385 1727204595.46228: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 51385 1727204595.46257: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 51385 1727204595.46314: variable '__network_wireless_connections_defined' from source: role '' defaults 51385 1727204595.46623: variable 'network_connections' from source: task vars 51385 1727204595.46634: variable 'interface' from source: play vars 51385 1727204595.46744: variable 'interface' from source: play vars 51385 1727204595.46768: variable 'vlan_interface' from source: play vars 51385 1727204595.46876: variable 'vlan_interface' from source: play vars 51385 1727204595.46889: variable 'interface' from source: play vars 51385 1727204595.46994: variable 'interface' from source: play vars 51385 1727204595.47057: variable '__network_packages_default_wireless' from source: role '' defaults 51385 1727204595.47148: variable '__network_wireless_connections_defined' from source: role '' defaults 51385 1727204595.47482: variable 'network_connections' from source: task vars 51385 1727204595.47493: variable 'interface' from source: play vars 51385 1727204595.47568: variable 'interface' from source: play vars 51385 1727204595.47583: variable 'vlan_interface' from source: play vars 51385 1727204595.47653: variable 'vlan_interface' from source: play vars 51385 1727204595.47669: variable 'interface' from source: play vars 51385 1727204595.47734: variable 'interface' from source: play vars 51385 1727204595.47774: variable '__network_packages_default_team' from source: role '' defaults 51385 1727204595.47861: variable 
'__network_team_connections_defined' from source: role '' defaults 51385 1727204595.48189: variable 'network_connections' from source: task vars 51385 1727204595.48199: variable 'interface' from source: play vars 51385 1727204595.48268: variable 'interface' from source: play vars 51385 1727204595.48285: variable 'vlan_interface' from source: play vars 51385 1727204595.48352: variable 'vlan_interface' from source: play vars 51385 1727204595.48368: variable 'interface' from source: play vars 51385 1727204595.48440: variable 'interface' from source: play vars 51385 1727204595.48522: variable '__network_service_name_default_initscripts' from source: role '' defaults 51385 1727204595.48588: variable '__network_service_name_default_initscripts' from source: role '' defaults 51385 1727204595.48600: variable '__network_packages_default_initscripts' from source: role '' defaults 51385 1727204595.48670: variable '__network_packages_default_initscripts' from source: role '' defaults 51385 1727204595.48907: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 51385 1727204595.49450: variable 'network_connections' from source: task vars 51385 1727204595.49463: variable 'interface' from source: play vars 51385 1727204595.49531: variable 'interface' from source: play vars 51385 1727204595.49545: variable 'vlan_interface' from source: play vars 51385 1727204595.49614: variable 'vlan_interface' from source: play vars 51385 1727204595.49626: variable 'interface' from source: play vars 51385 1727204595.49695: variable 'interface' from source: play vars 51385 1727204595.49715: variable 'ansible_distribution' from source: facts 51385 1727204595.49723: variable '__network_rh_distros' from source: role '' defaults 51385 1727204595.49730: variable 'ansible_distribution_major_version' from source: facts 51385 1727204595.49754: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 51385 1727204595.49922: variable 
'ansible_distribution' from source: facts 51385 1727204595.49929: variable '__network_rh_distros' from source: role '' defaults 51385 1727204595.49937: variable 'ansible_distribution_major_version' from source: facts 51385 1727204595.49950: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 51385 1727204595.50114: variable 'ansible_distribution' from source: facts 51385 1727204595.50122: variable '__network_rh_distros' from source: role '' defaults 51385 1727204595.50133: variable 'ansible_distribution_major_version' from source: facts 51385 1727204595.50174: variable 'network_provider' from source: set_fact 51385 1727204595.50193: variable 'ansible_facts' from source: unknown 51385 1727204595.50958: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 51385 1727204595.50972: when evaluation is False, skipping this task 51385 1727204595.50982: _execute() done 51385 1727204595.50988: dumping result to json 51385 1727204595.50995: done dumping result, returning 51385 1727204595.51010: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install packages [0affcd87-79f5-6b1f-5706-00000000001e] 51385 1727204595.51020: sending task result for task 0affcd87-79f5-6b1f-5706-00000000001e skipping: [managed-node1] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 51385 1727204595.51174: no more pending results, returning what we have 51385 1727204595.51179: results queue empty 51385 1727204595.51180: checking for any_errors_fatal 51385 1727204595.51186: done checking for any_errors_fatal 51385 1727204595.51187: checking for max_fail_percentage 51385 1727204595.51189: done checking for max_fail_percentage 51385 1727204595.51190: checking to see if all hosts have failed and the running result is not ok 51385 1727204595.51191: done checking to see if all hosts 
have failed 51385 1727204595.51191: getting the remaining hosts for this loop 51385 1727204595.51193: done getting the remaining hosts for this loop 51385 1727204595.51197: getting the next task for host managed-node1 51385 1727204595.51204: done getting next task for host managed-node1 51385 1727204595.51209: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 51385 1727204595.51217: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51385 1727204595.51233: getting variables 51385 1727204595.51235: in VariableManager get_vars() 51385 1727204595.51282: Calling all_inventory to load vars for managed-node1 51385 1727204595.51285: Calling groups_inventory to load vars for managed-node1 51385 1727204595.51288: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204595.51299: Calling all_plugins_play to load vars for managed-node1 51385 1727204595.51302: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204595.51304: Calling groups_plugins_play to load vars for managed-node1 51385 1727204595.52283: done sending task result for task 0affcd87-79f5-6b1f-5706-00000000001e 51385 1727204595.52287: WORKER PROCESS EXITING 51385 1727204595.53029: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204595.54821: done with get_vars() 51385 1727204595.54849: done getting variables 51385 1727204595.54919: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 15:03:15 -0400 (0:00:00.161) 0:00:13.953 ***** 51385 1727204595.54957: entering _queue_task() for managed-node1/package 51385 1727204595.55311: worker is 1 (out of 1 available) 51385 1727204595.55326: exiting _queue_task() for managed-node1/package 51385 1727204595.55338: done queuing things up, now waiting for results queue to drain 51385 1727204595.55340: waiting for pending results... 
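[Editor's note] The "Install packages" skip above hinged on the conditional `not network_packages is subset(ansible_facts.packages.keys())`. Ansible's `subset` test is ordinary set containment; a minimal stand-in (the package data here is made up for illustration, not taken from the managed node):

```python
# Re-creating the skipped conditional outside Ansible.
# `a is subset(b)` in Jinja is equivalent to set(a) <= set(b).
network_packages = ["NetworkManager"]            # role's computed package list (illustrative)
installed = {                                    # shape of ansible_facts.packages (illustrative)
    "NetworkManager": [{"version": "1.0"}],
    "openssh-server": [{"version": "9.0"}],
}
missing = not set(network_packages) <= set(installed.keys())
# missing is False here, so the task is skipped -- matching the
# "Conditional result was False" skip_reason in the log.
```

Because all required packages appear in the gathered package facts, the conditional is False and the task executor short-circuits before dispatching the `package` action.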
51385 1727204595.55640: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 51385 1727204595.55780: in run() - task 0affcd87-79f5-6b1f-5706-00000000001f 51385 1727204595.55804: variable 'ansible_search_path' from source: unknown 51385 1727204595.55813: variable 'ansible_search_path' from source: unknown 51385 1727204595.55855: calling self._execute() 51385 1727204595.55950: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204595.55962: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204595.55976: variable 'omit' from source: magic vars 51385 1727204595.56347: variable 'ansible_distribution_major_version' from source: facts 51385 1727204595.56369: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204595.56498: variable 'network_state' from source: role '' defaults 51385 1727204595.56512: Evaluated conditional (network_state != {}): False 51385 1727204595.56519: when evaluation is False, skipping this task 51385 1727204595.56525: _execute() done 51385 1727204595.56532: dumping result to json 51385 1727204595.56539: done dumping result, returning 51385 1727204595.56553: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcd87-79f5-6b1f-5706-00000000001f] 51385 1727204595.56571: sending task result for task 0affcd87-79f5-6b1f-5706-00000000001f skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 51385 1727204595.56722: no more pending results, returning what we have 51385 1727204595.56727: results queue empty 51385 1727204595.56728: checking for any_errors_fatal 51385 1727204595.56733: done checking for any_errors_fatal 51385 1727204595.56734: checking for max_fail_percentage 51385 
1727204595.56736: done checking for max_fail_percentage 51385 1727204595.56737: checking to see if all hosts have failed and the running result is not ok 51385 1727204595.56738: done checking to see if all hosts have failed 51385 1727204595.56739: getting the remaining hosts for this loop 51385 1727204595.56741: done getting the remaining hosts for this loop 51385 1727204595.56745: getting the next task for host managed-node1 51385 1727204595.56752: done getting next task for host managed-node1 51385 1727204595.56757: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 51385 1727204595.56763: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51385 1727204595.56782: getting variables 51385 1727204595.56784: in VariableManager get_vars() 51385 1727204595.56827: Calling all_inventory to load vars for managed-node1 51385 1727204595.56830: Calling groups_inventory to load vars for managed-node1 51385 1727204595.56833: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204595.56847: Calling all_plugins_play to load vars for managed-node1 51385 1727204595.56849: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204595.56852: Calling groups_plugins_play to load vars for managed-node1 51385 1727204595.61611: done sending task result for task 0affcd87-79f5-6b1f-5706-00000000001f 51385 1727204595.61615: WORKER PROCESS EXITING 51385 1727204595.62542: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204595.63735: done with get_vars() 51385 1727204595.63752: done getting variables 51385 1727204595.63792: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 15:03:15 -0400 (0:00:00.088) 0:00:14.042 ***** 51385 1727204595.63814: entering _queue_task() for managed-node1/package 51385 1727204595.64041: worker is 1 (out of 1 available) 51385 1727204595.64055: exiting _queue_task() for managed-node1/package 51385 1727204595.64071: done queuing things up, now waiting for results queue to drain 51385 1727204595.64073: waiting for pending results... 
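[Editor's note] Both nmstate-related install tasks above are gated on `network_state != {}`. Per the log, `network_state` comes from "role '' defaults", i.e. it defaults to an empty mapping, so the tasks only run when the caller supplies a non-empty `network_state` (an inference from the log, not the role source):

```python
# The gate as evaluated by the TaskExecutor, sketched in plain Python.
network_state = {}                   # role default, per "from source: role '' defaults"
run_task = network_state != {}      # False -> task skipped

# Had the play defined a declarative state, the gate flips:
network_state_user = {"interfaces": [{"name": "eth0", "state": "up"}]}  # illustrative
run_task_user = network_state_user != {}
```

This is why the `package` action module is loaded and cached but never executed for these two tasks: `_execute()` returns a skip result before any module dispatch.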
51385 1727204595.64248: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 51385 1727204595.64336: in run() - task 0affcd87-79f5-6b1f-5706-000000000020 51385 1727204595.64349: variable 'ansible_search_path' from source: unknown 51385 1727204595.64353: variable 'ansible_search_path' from source: unknown 51385 1727204595.64384: calling self._execute() 51385 1727204595.64451: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204595.64455: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204595.64466: variable 'omit' from source: magic vars 51385 1727204595.64750: variable 'ansible_distribution_major_version' from source: facts 51385 1727204595.64769: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204595.64901: variable 'network_state' from source: role '' defaults 51385 1727204595.64915: Evaluated conditional (network_state != {}): False 51385 1727204595.64921: when evaluation is False, skipping this task 51385 1727204595.64926: _execute() done 51385 1727204595.64932: dumping result to json 51385 1727204595.64937: done dumping result, returning 51385 1727204595.64946: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcd87-79f5-6b1f-5706-000000000020] 51385 1727204595.64967: sending task result for task 0affcd87-79f5-6b1f-5706-000000000020 51385 1727204595.65098: done sending task result for task 0affcd87-79f5-6b1f-5706-000000000020 51385 1727204595.65106: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 51385 1727204595.65165: no more pending results, returning what we have 51385 1727204595.65169: results queue empty 51385 1727204595.65170: checking for 
any_errors_fatal 51385 1727204595.65178: done checking for any_errors_fatal 51385 1727204595.65179: checking for max_fail_percentage 51385 1727204595.65180: done checking for max_fail_percentage 51385 1727204595.65181: checking to see if all hosts have failed and the running result is not ok 51385 1727204595.65182: done checking to see if all hosts have failed 51385 1727204595.65183: getting the remaining hosts for this loop 51385 1727204595.65184: done getting the remaining hosts for this loop 51385 1727204595.65188: getting the next task for host managed-node1 51385 1727204595.65194: done getting next task for host managed-node1 51385 1727204595.65197: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 51385 1727204595.65200: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51385 1727204595.65217: getting variables 51385 1727204595.65219: in VariableManager get_vars() 51385 1727204595.65257: Calling all_inventory to load vars for managed-node1 51385 1727204595.65260: Calling groups_inventory to load vars for managed-node1 51385 1727204595.65262: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204595.65275: Calling all_plugins_play to load vars for managed-node1 51385 1727204595.65277: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204595.65280: Calling groups_plugins_play to load vars for managed-node1 51385 1727204595.66615: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204595.67571: done with get_vars() 51385 1727204595.67589: done getting variables 51385 1727204595.67666: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 15:03:15 -0400 (0:00:00.038) 0:00:14.080 ***** 51385 1727204595.67692: entering _queue_task() for managed-node1/service 51385 1727204595.67693: Creating lock for service 51385 1727204595.67919: worker is 1 (out of 1 available) 51385 1727204595.67932: exiting _queue_task() for managed-node1/service 51385 1727204595.67944: done queuing things up, now waiting for results queue to drain 51385 1727204595.67946: waiting for pending results... 
51385 1727204595.68144: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 51385 1727204595.68281: in run() - task 0affcd87-79f5-6b1f-5706-000000000021 51385 1727204595.68300: variable 'ansible_search_path' from source: unknown 51385 1727204595.68308: variable 'ansible_search_path' from source: unknown 51385 1727204595.68349: calling self._execute() 51385 1727204595.68450: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204595.68470: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204595.68485: variable 'omit' from source: magic vars 51385 1727204595.68896: variable 'ansible_distribution_major_version' from source: facts 51385 1727204595.68922: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204595.69051: variable '__network_wireless_connections_defined' from source: role '' defaults 51385 1727204595.69272: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 51385 1727204595.71192: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 51385 1727204595.71247: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 51385 1727204595.71278: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 51385 1727204595.71305: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 51385 1727204595.71330: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 51385 1727204595.71389: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 51385 1727204595.71412: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 51385 1727204595.71433: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 51385 1727204595.71460: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 51385 1727204595.71475: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 51385 1727204595.71507: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 51385 1727204595.71526: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 51385 1727204595.71545: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 51385 1727204595.71576: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 51385 1727204595.71586: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 51385 1727204595.71613: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 51385 1727204595.71632: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 51385 1727204595.71651: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 51385 1727204595.71682: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 51385 1727204595.71692: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 51385 1727204595.71814: variable 'network_connections' from source: task vars 51385 1727204595.71825: variable 'interface' from source: play vars 51385 1727204595.71891: variable 'interface' from source: play vars 51385 1727204595.71921: variable 'vlan_interface' from source: play vars 51385 1727204595.72016: variable 'vlan_interface' from source: play vars 51385 1727204595.72324: variable 'interface' from source: play vars 51385 1727204595.72328: variable 'interface' from source: play vars 51385 1727204595.72331: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 51385 1727204595.72356: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 51385 
1727204595.72400: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 51385 1727204595.72435: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 51385 1727204595.72479: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 51385 1727204595.72524: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 51385 1727204595.72549: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 51385 1727204595.72589: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 51385 1727204595.72625: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 51385 1727204595.72694: variable '__network_team_connections_defined' from source: role '' defaults 51385 1727204595.72948: variable 'network_connections' from source: task vars 51385 1727204595.72955: variable 'interface' from source: play vars 51385 1727204595.73032: variable 'interface' from source: play vars 51385 1727204595.73035: variable 'vlan_interface' from source: play vars 51385 1727204595.73091: variable 'vlan_interface' from source: play vars 51385 1727204595.73094: variable 'interface' from source: play vars 51385 1727204595.73147: variable 'interface' from source: play vars 51385 1727204595.73189: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 51385 1727204595.73204: when 
evaluation is False, skipping this task 51385 1727204595.73210: _execute() done 51385 1727204595.73216: dumping result to json 51385 1727204595.73222: done dumping result, returning 51385 1727204595.73232: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcd87-79f5-6b1f-5706-000000000021] 51385 1727204595.73244: sending task result for task 0affcd87-79f5-6b1f-5706-000000000021 51385 1727204595.73351: done sending task result for task 0affcd87-79f5-6b1f-5706-000000000021 51385 1727204595.73362: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 51385 1727204595.73415: no more pending results, returning what we have 51385 1727204595.73418: results queue empty 51385 1727204595.73419: checking for any_errors_fatal 51385 1727204595.73427: done checking for any_errors_fatal 51385 1727204595.73428: checking for max_fail_percentage 51385 1727204595.73430: done checking for max_fail_percentage 51385 1727204595.73430: checking to see if all hosts have failed and the running result is not ok 51385 1727204595.73431: done checking to see if all hosts have failed 51385 1727204595.73432: getting the remaining hosts for this loop 51385 1727204595.73433: done getting the remaining hosts for this loop 51385 1727204595.73437: getting the next task for host managed-node1 51385 1727204595.73443: done getting next task for host managed-node1 51385 1727204595.73447: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 51385 1727204595.73450: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 51385 1727204595.73463: getting variables 51385 1727204595.73467: in VariableManager get_vars() 51385 1727204595.73509: Calling all_inventory to load vars for managed-node1 51385 1727204595.73511: Calling groups_inventory to load vars for managed-node1 51385 1727204595.73513: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204595.73523: Calling all_plugins_play to load vars for managed-node1 51385 1727204595.73525: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204595.73528: Calling groups_plugins_play to load vars for managed-node1 51385 1727204595.74821: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204595.75783: done with get_vars() 51385 1727204595.75800: done getting variables 51385 1727204595.75842: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 15:03:15 -0400 (0:00:00.081) 0:00:14.162 ***** 51385 1727204595.75871: entering _queue_task() for managed-node1/service 51385 1727204595.76127: worker is 1 (out of 1 available) 51385 1727204595.76140: exiting _queue_task() for managed-node1/service 
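[Editor's note] The "Restart NetworkManager due to wireless or team interfaces" skip above evaluated `__network_wireless_connections_defined or __network_team_connections_defined` to False. Judging from the variables the log resolves (`network_connections`, `interface`, `vlan_interface`), each flag amounts to scanning the connection list for a given type; a hedged sketch of that check (the scan below is an assumption about the role's Jinja logic, and the connection data is illustrative):

```python
# Approximation of the two defined-connection flags. The real role
# computes these via Jinja filters over network_connections; the
# connection entries here mirror the interface/vlan_interface play
# vars seen in the log, with invented names.
network_connections = [
    {"name": "ethtest0", "type": "ethernet"},   # 'interface' play var (hypothetical name)
    {"name": "ethtest0.100", "type": "vlan"},   # 'vlan_interface' play var (hypothetical name)
]
wireless_defined = any(c.get("type") == "wireless" for c in network_connections)
team_defined = any(c.get("type") == "team" for c in network_connections)
restart_nm = wireless_defined or team_defined
# restart_nm is False -> the service restart task is skipped, as logged.
```

Only ethernet and vlan connections are defined in this play, so neither flag is set and NetworkManager is not restarted; the very next task ("Enable and start NetworkManager") instead passes its `network_provider == "nm" or network_state != {}` gate because the provider was set to `nm` via set_fact.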
51385 1727204595.76151: done queuing things up, now waiting for results queue to drain 51385 1727204595.76152: waiting for pending results... 51385 1727204595.76393: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 51385 1727204595.76541: in run() - task 0affcd87-79f5-6b1f-5706-000000000022 51385 1727204595.76567: variable 'ansible_search_path' from source: unknown 51385 1727204595.76576: variable 'ansible_search_path' from source: unknown 51385 1727204595.76621: calling self._execute() 51385 1727204595.76716: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204595.76731: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204595.76745: variable 'omit' from source: magic vars 51385 1727204595.77135: variable 'ansible_distribution_major_version' from source: facts 51385 1727204595.77156: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204595.77332: variable 'network_provider' from source: set_fact 51385 1727204595.77342: variable 'network_state' from source: role '' defaults 51385 1727204595.77371: Evaluated conditional (network_provider == "nm" or network_state != {}): True 51385 1727204595.77388: variable 'omit' from source: magic vars 51385 1727204595.77459: variable 'omit' from source: magic vars 51385 1727204595.77485: variable 'network_service_name' from source: role '' defaults 51385 1727204595.77539: variable 'network_service_name' from source: role '' defaults 51385 1727204595.77624: variable '__network_provider_setup' from source: role '' defaults 51385 1727204595.77628: variable '__network_service_name_default_nm' from source: role '' defaults 51385 1727204595.77680: variable '__network_service_name_default_nm' from source: role '' defaults 51385 1727204595.77687: variable '__network_packages_default_nm' from source: role '' defaults 51385 1727204595.77732: variable 
'__network_packages_default_nm' from source: role '' defaults 51385 1727204595.77889: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 51385 1727204595.79704: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 51385 1727204595.79747: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 51385 1727204595.79788: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 51385 1727204595.79812: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 51385 1727204595.79837: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 51385 1727204595.79895: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 51385 1727204595.79915: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 51385 1727204595.79936: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 51385 1727204595.79968: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 51385 1727204595.79979: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 51385 
1727204595.80011: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 51385 1727204595.80027: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 51385 1727204595.80048: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 51385 1727204595.80079: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 51385 1727204595.80089: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 51385 1727204595.80231: variable '__network_packages_default_gobject_packages' from source: role '' defaults 51385 1727204595.80310: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 51385 1727204595.80326: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 51385 1727204595.80342: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 51385 1727204595.80373: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 51385 1727204595.80387: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 51385 1727204595.80448: variable 'ansible_python' from source: facts 51385 1727204595.80468: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 51385 1727204595.80529: variable '__network_wpa_supplicant_required' from source: role '' defaults 51385 1727204595.80591: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 51385 1727204595.80672: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 51385 1727204595.80691: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 51385 1727204595.80710: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 51385 1727204595.80734: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 51385 1727204595.80744: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 51385 1727204595.80780: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 51385 1727204595.80800: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 51385 1727204595.80819: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 51385 1727204595.80844: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 51385 1727204595.80856: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 51385 1727204595.80947: variable 'network_connections' from source: task vars 51385 1727204595.80953: variable 'interface' from source: play vars 51385 1727204595.81006: variable 'interface' from source: play vars 51385 1727204595.81023: variable 'vlan_interface' from source: play vars 51385 1727204595.81073: variable 'vlan_interface' from source: play vars 51385 1727204595.81082: variable 'interface' from source: play vars 51385 1727204595.81135: variable 'interface' from source: play vars 51385 1727204595.81208: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 51385 1727204595.81341: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 51385 1727204595.81375: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 51385 1727204595.81407: Loading TestModule 
'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 51385 1727204595.81437: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 51385 1727204595.81485: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 51385 1727204595.81506: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 51385 1727204595.81530: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 51385 1727204595.81558: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 51385 1727204595.81593: variable '__network_wireless_connections_defined' from source: role '' defaults 51385 1727204595.81770: variable 'network_connections' from source: task vars 51385 1727204595.81782: variable 'interface' from source: play vars 51385 1727204595.81830: variable 'interface' from source: play vars 51385 1727204595.81841: variable 'vlan_interface' from source: play vars 51385 1727204595.81894: variable 'vlan_interface' from source: play vars 51385 1727204595.81903: variable 'interface' from source: play vars 51385 1727204595.81953: variable 'interface' from source: play vars 51385 1727204595.81997: variable '__network_packages_default_wireless' from source: role '' defaults 51385 1727204595.82048: variable '__network_wireless_connections_defined' from source: role '' defaults 51385 1727204595.82239: variable 'network_connections' from source: task vars 51385 1727204595.82242: variable 
'interface' from source: play vars 51385 1727204595.82294: variable 'interface' from source: play vars 51385 1727204595.82302: variable 'vlan_interface' from source: play vars 51385 1727204595.82353: variable 'vlan_interface' from source: play vars 51385 1727204595.82361: variable 'interface' from source: play vars 51385 1727204595.82410: variable 'interface' from source: play vars 51385 1727204595.82434: variable '__network_packages_default_team' from source: role '' defaults 51385 1727204595.82488: variable '__network_team_connections_defined' from source: role '' defaults 51385 1727204595.82673: variable 'network_connections' from source: task vars 51385 1727204595.82676: variable 'interface' from source: play vars 51385 1727204595.82726: variable 'interface' from source: play vars 51385 1727204595.82733: variable 'vlan_interface' from source: play vars 51385 1727204595.82788: variable 'vlan_interface' from source: play vars 51385 1727204595.82794: variable 'interface' from source: play vars 51385 1727204595.82843: variable 'interface' from source: play vars 51385 1727204595.82892: variable '__network_service_name_default_initscripts' from source: role '' defaults 51385 1727204595.82933: variable '__network_service_name_default_initscripts' from source: role '' defaults 51385 1727204595.82940: variable '__network_packages_default_initscripts' from source: role '' defaults 51385 1727204595.82988: variable '__network_packages_default_initscripts' from source: role '' defaults 51385 1727204595.83121: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 51385 1727204595.83443: variable 'network_connections' from source: task vars 51385 1727204595.83447: variable 'interface' from source: play vars 51385 1727204595.83492: variable 'interface' from source: play vars 51385 1727204595.83499: variable 'vlan_interface' from source: play vars 51385 1727204595.83543: variable 'vlan_interface' from source: play vars 51385 1727204595.83549: 
variable 'interface' from source: play vars 51385 1727204595.83595: variable 'interface' from source: play vars 51385 1727204595.83604: variable 'ansible_distribution' from source: facts 51385 1727204595.83607: variable '__network_rh_distros' from source: role '' defaults 51385 1727204595.83612: variable 'ansible_distribution_major_version' from source: facts 51385 1727204595.83634: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 51385 1727204595.83744: variable 'ansible_distribution' from source: facts 51385 1727204595.83752: variable '__network_rh_distros' from source: role '' defaults 51385 1727204595.83754: variable 'ansible_distribution_major_version' from source: facts 51385 1727204595.83767: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 51385 1727204595.83881: variable 'ansible_distribution' from source: facts 51385 1727204595.83884: variable '__network_rh_distros' from source: role '' defaults 51385 1727204595.83887: variable 'ansible_distribution_major_version' from source: facts 51385 1727204595.83914: variable 'network_provider' from source: set_fact 51385 1727204595.83930: variable 'omit' from source: magic vars 51385 1727204595.83950: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 51385 1727204595.83977: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 51385 1727204595.83992: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 51385 1727204595.84005: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 51385 1727204595.84014: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 51385 1727204595.84036: variable 'inventory_hostname' from source: host vars for 
'managed-node1' 51385 1727204595.84040: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204595.84042: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204595.84114: Set connection var ansible_pipelining to False 51385 1727204595.84122: Set connection var ansible_shell_type to sh 51385 1727204595.84129: Set connection var ansible_module_compression to ZIP_DEFLATED 51385 1727204595.84132: Set connection var ansible_timeout to 10 51385 1727204595.84134: Set connection var ansible_connection to ssh 51385 1727204595.84140: Set connection var ansible_shell_executable to /bin/sh 51385 1727204595.84157: variable 'ansible_shell_executable' from source: unknown 51385 1727204595.84162: variable 'ansible_connection' from source: unknown 51385 1727204595.84166: variable 'ansible_module_compression' from source: unknown 51385 1727204595.84169: variable 'ansible_shell_type' from source: unknown 51385 1727204595.84171: variable 'ansible_shell_executable' from source: unknown 51385 1727204595.84173: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204595.84182: variable 'ansible_pipelining' from source: unknown 51385 1727204595.84184: variable 'ansible_timeout' from source: unknown 51385 1727204595.84190: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204595.84249: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 51385 1727204595.84258: variable 'omit' from source: magic vars 51385 1727204595.84264: starting attempt loop 51385 1727204595.84267: running the handler 51385 1727204595.84323: variable 'ansible_facts' from source: unknown 51385 1727204595.84742: _low_level_execute_command(): starting 
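The first `_low_level_execute_command()` call below runs `/bin/sh -c 'echo ~ && sleep 0'` on the managed node: the ssh connection plugin probes the remote user's home directory, which later anchors the `~/.ansible/tmp` working directory. A minimal local re-creation of that probe (the command string is taken verbatim from the log; the `discover_home` function name is my own, and this runs locally rather than over ssh):

```python
# Re-create the home-directory probe Ansible issues before any module runs.
# The command string matches the log: `echo ~` makes the shell expand the
# current user's home directory; `&& sleep 0` mirrors Ansible's habit of
# appending a trivial command so the shell exits cleanly with status 0.
import subprocess

def discover_home() -> str:
    proc = subprocess.run(
        ["/bin/sh", "-c", "echo ~ && sleep 0"],
        capture_output=True, text=True, check=True,  # check=True: raise if rc != 0
    )
    return proc.stdout.strip()  # the log's run returned "/root"
```

In the transcript this is exactly the `rc=0, stdout=/root` result a few entries down; everything else in those entries is OpenSSH `debug1`/`debug2` chatter from the multiplexed connection.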
51385 1727204595.84748: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 51385 1727204595.85262: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204595.85287: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204595.85300: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204595.85315: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204595.85366: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204595.85379: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204595.85453: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204595.87149: stdout chunk (state=3): >>>/root <<< 51385 1727204595.87253: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204595.87308: stderr chunk (state=3): >>><<< 51385 1727204595.87311: stdout chunk (state=3): >>><<< 51385 1727204595.87332: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204595.87342: _low_level_execute_command(): starting 51385 1727204595.87350: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204595.8733146-52481-273622279401879 `" && echo ansible-tmp-1727204595.8733146-52481-273622279401879="` echo /root/.ansible/tmp/ansible-tmp-1727204595.8733146-52481-273622279401879 `" ) && sleep 0' 51385 1727204595.87812: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204595.87824: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204595.87843: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 51385 
1727204595.87855: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204595.87879: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204595.87917: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204595.87928: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204595.87991: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204595.89845: stdout chunk (state=3): >>>ansible-tmp-1727204595.8733146-52481-273622279401879=/root/.ansible/tmp/ansible-tmp-1727204595.8733146-52481-273622279401879 <<< 51385 1727204595.89958: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204595.90015: stderr chunk (state=3): >>><<< 51385 1727204595.90023: stdout chunk (state=3): >>><<< 51385 1727204595.90041: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204595.8733146-52481-273622279401879=/root/.ansible/tmp/ansible-tmp-1727204595.8733146-52481-273622279401879 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204595.90073: variable 'ansible_module_compression' from source: unknown 51385 1727204595.90121: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 51385 1727204595.90125: ANSIBALLZ: Acquiring lock 51385 1727204595.90129: ANSIBALLZ: Lock acquired: 140124837667440 51385 1727204595.90131: ANSIBALLZ: Creating module 51385 1727204596.20541: ANSIBALLZ: Writing module into payload 51385 1727204596.20737: ANSIBALLZ: Writing module 51385 1727204596.20786: ANSIBALLZ: Renaming module 51385 1727204596.20798: ANSIBALLZ: Done creating module 51385 1727204596.20839: variable 'ansible_facts' from source: unknown 51385 1727204596.21034: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204595.8733146-52481-273622279401879/AnsiballZ_systemd.py 51385 1727204596.21201: Sending initial data 51385 1727204596.21204: Sent initial data (156 bytes) 51385 1727204596.22197: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204596.22211: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204596.22224: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204596.22242: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204596.22290: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204596.22302: stderr chunk (state=3): >>>debug2: match not found <<< 51385 1727204596.22314: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204596.22330: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 51385 1727204596.22341: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 51385 1727204596.22352: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 51385 1727204596.22369: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204596.22382: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204596.22396: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204596.22407: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204596.22417: stderr chunk (state=3): >>>debug2: match found <<< 51385 1727204596.22428: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204596.22508: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204596.22524: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204596.22538: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204596.22635: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204596.24452: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: 
Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 51385 1727204596.24509: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 51385 1727204596.24568: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-51385tpxlmlox/tmpq_r2lfec /root/.ansible/tmp/ansible-tmp-1727204595.8733146-52481-273622279401879/AnsiballZ_systemd.py <<< 51385 1727204596.24622: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 51385 1727204596.27179: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204596.27318: stderr chunk (state=3): >>><<< 51385 1727204596.27322: stdout chunk (state=3): >>><<< 51385 1727204596.27343: done transferring module to remote 51385 1727204596.27354: _low_level_execute_command(): starting 51385 1727204596.27361: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204595.8733146-52481-273622279401879/ /root/.ansible/tmp/ansible-tmp-1727204595.8733146-52481-273622279401879/AnsiballZ_systemd.py && sleep 0' 51385 1727204596.28009: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204596.28017: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204596.28027: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204596.28041: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204596.28082: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204596.28088: stderr chunk (state=3): >>>debug2: match not found <<< 51385 1727204596.28098: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204596.28112: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 51385 1727204596.28119: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 51385 1727204596.28126: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 51385 1727204596.28133: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204596.28142: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204596.28153: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204596.28163: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204596.28167: stderr chunk (state=3): >>>debug2: match found <<< 51385 1727204596.28177: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204596.28247: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204596.28266: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204596.28273: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204596.28355: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204596.30224: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204596.30228: stdout chunk (state=3): >>><<< 51385 1727204596.30230: stderr chunk (state=3): >>><<< 
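The `ANSIBALLZ` entries above (Creating module → Writing module into payload → transferring `AnsiballZ_systemd.py` via sftp → `chmod u+x`) describe Ansible packing the `systemd` module plus its dependencies into a single self-contained Python file, so one sftp put and one remote python invocation suffice. The sketch below is not Ansible's actual wrapper, just a minimal illustration of the same idea under simplified assumptions: one embedded module, a base64-encoded `ZIP_DEFLATED` archive (the compression mode named by `ansible_module_compression` in the log), and a local unpack-and-run step standing in for the remote execution:

```python
# Minimal AnsiballZ-style round trip: embed a module's source in a
# base64-encoded zip "payload", then unpack and execute it, capturing the
# JSON it prints on stdout (Ansible modules report results as JSON).
import base64
import contextlib
import io
import json
import os
import runpy
import tempfile
import zipfile

MODULE_SOURCE = '''\
import json
print(json.dumps({"changed": False, "ping": "pong"}))
'''

def build_payload(source: str) -> str:
    """Zip the module source and base64-encode it for transport."""
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
        zf.writestr("module.py", source)
    return base64.b64encode(buf.getvalue()).decode("ascii")

def run_payload(payload: str) -> str:
    """Decode, extract, and execute the embedded module; return its stdout."""
    data = base64.b64decode(payload)
    with tempfile.TemporaryDirectory() as tmp:
        with zipfile.ZipFile(io.BytesIO(data)) as zf:
            zf.extract("module.py", tmp)
        out = io.StringIO()
        with contextlib.redirect_stdout(out):
            runpy.run_path(os.path.join(tmp, "module.py"), run_name="__main__")
        return out.getvalue()

result = json.loads(run_payload(build_payload(MODULE_SOURCE)))
```

The real payload also bundles `module_utils`, handles interpreter discovery (note the log executes with `/usr/bin/python3.9` on the remote side, not the controller's 3.12), and is written into the timestamped `~/.ansible/tmp/ansible-tmp-…` directory created two entries earlier.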
51385 1727204596.30271: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204596.30280: _low_level_execute_command(): starting 51385 1727204596.30283: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204595.8733146-52481-273622279401879/AnsiballZ_systemd.py && sleep 0' 51385 1727204596.30963: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204596.30979: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204596.30992: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204596.31007: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204596.31049: stderr chunk (state=3): >>>debug2: checking match for 
'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204596.31065: stderr chunk (state=3): >>>debug2: match not found <<< 51385 1727204596.31078: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204596.31093: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 51385 1727204596.31103: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 51385 1727204596.31111: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 51385 1727204596.31120: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204596.31130: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204596.31143: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204596.31154: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204596.31171: stderr chunk (state=3): >>>debug2: match found <<< 51385 1727204596.31183: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204596.31257: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204596.31286: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204596.31300: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204596.31396: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204596.56500: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", 
"TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "70053", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 15:02:41 EDT", "ExecMainStartTimestampMonotonic": "824430121", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "70053", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[Tue 2024-09-24 15:02:41 EDT] ; stop_time=[n/a] ; pid=70053 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[Tue 2024-09-24 15:02:41 EDT] ; stop_time=[n/a] ; pid=70053 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "<<< 51385 1727204596.56548: stdout chunk (state=3): >>>system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "5101", "MemoryCurrent": "6086656", "MemoryAvailable": 
"infinity", "CPUUsageNSec": "141925000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", 
"LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal"<<< 51385 1727204596.56557: stdout 
chunk (state=3): >>>: "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service NetworkManager-wait-online.service network.target network.service shutdown.target multi-user.target", "After": "dbus.socket systemd-journald.socket sysinit.target network-pre.target basic.target system.slice cloud-init-local.service dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 15:02:41 EDT", "StateChangeTimestampMonotonic": "824517223", "InactiveExitTimestamp": "Tue 2024-09-24 15:02:41 EDT", "InactiveExitTimestampMonotonic": "824430408", "ActiveEnterTimestamp": "Tue 2024-09-24 15:02:41 EDT", "ActiveEnterTimestampMonotonic": "824517223", "ActiveExitTimestamp": "Tue 2024-09-24 15:02:41 EDT", "ActiveExitTimestampMonotonic": "824386950", "InactiveEnterTimestamp": "Tue 2024-09-24 15:02:41 EDT", "InactiveEnterTimestampMonotonic": "824423584", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", 
"ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 15:02:41 EDT", "ConditionTimestampMonotonic": "824424123", "AssertTimestamp": "Tue 2024-09-24 15:02:41 EDT", "AssertTimestampMonotonic": "824424125", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "1aab9e9897314f7fb6bad2151914424e", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 51385 1727204596.58086: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. <<< 51385 1727204596.58177: stderr chunk (state=3): >>><<< 51385 1727204596.58181: stdout chunk (state=3): >>><<< 51385 1727204596.58278: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "70053", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 15:02:41 EDT", 
"ExecMainStartTimestampMonotonic": "824430121", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "70053", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[Tue 2024-09-24 15:02:41 EDT] ; stop_time=[n/a] ; pid=70053 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[Tue 2024-09-24 15:02:41 EDT] ; stop_time=[n/a] ; pid=70053 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "5101", "MemoryCurrent": "6086656", "MemoryAvailable": "infinity", "CPUUsageNSec": "141925000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", 
"StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", 
"CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service NetworkManager-wait-online.service network.target network.service shutdown.target multi-user.target", "After": "dbus.socket systemd-journald.socket sysinit.target network-pre.target basic.target system.slice cloud-init-local.service dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": 
"system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 15:02:41 EDT", "StateChangeTimestampMonotonic": "824517223", "InactiveExitTimestamp": "Tue 2024-09-24 15:02:41 EDT", "InactiveExitTimestampMonotonic": "824430408", "ActiveEnterTimestamp": "Tue 2024-09-24 15:02:41 EDT", "ActiveEnterTimestampMonotonic": "824517223", "ActiveExitTimestamp": "Tue 2024-09-24 15:02:41 EDT", "ActiveExitTimestampMonotonic": "824386950", "InactiveEnterTimestamp": "Tue 2024-09-24 15:02:41 EDT", "InactiveEnterTimestampMonotonic": "824423584", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 15:02:41 EDT", "ConditionTimestampMonotonic": "824424123", "AssertTimestamp": "Tue 2024-09-24 15:02:41 EDT", "AssertTimestampMonotonic": "824424125", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "1aab9e9897314f7fb6bad2151914424e", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 
2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
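The `ansible.legacy.systemd` result captured above records `module_args` of `name: NetworkManager`, `state: started`, `enabled: true`. For readers following the trace, that invocation corresponds to a task roughly like the sketch below. This is a reconstruction for illustration only, not the actual source of the `fedora.linux_system_roles.network` role; the task name and the `no_log` behavior are taken from the log, everything else is inferred from the recorded `module_args`:

```yaml
# Illustrative reconstruction from the module_args recorded in the log;
# the real task lives in the fedora.linux_system_roles.network role.
- name: Enable and start NetworkManager
  ansible.builtin.systemd:
    name: NetworkManager
    state: started
    enabled: true
  no_log: true  # matches the "'no_log: true' was specified" censoring seen in the result
```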
51385 1727204596.58639: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204595.8733146-52481-273622279401879/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 51385 1727204596.58672: _low_level_execute_command(): starting 51385 1727204596.58692: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204595.8733146-52481-273622279401879/ > /dev/null 2>&1 && sleep 0' 51385 1727204596.59536: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204596.59540: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204596.59583: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204596.59653: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204596.59697: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204596.59797: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204596.61568: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204596.61616: stderr chunk (state=3): >>><<< 51385 1727204596.61620: stdout chunk (state=3): >>><<< 51385 1727204596.61636: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204596.61647: handler run complete 51385 1727204596.61689: attempt loop 
complete, returning result 51385 1727204596.61692: _execute() done 51385 1727204596.61695: dumping result to json 51385 1727204596.61705: done dumping result, returning 51385 1727204596.61713: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcd87-79f5-6b1f-5706-000000000022] 51385 1727204596.61718: sending task result for task 0affcd87-79f5-6b1f-5706-000000000022 51385 1727204596.61963: done sending task result for task 0affcd87-79f5-6b1f-5706-000000000022 51385 1727204596.61968: WORKER PROCESS EXITING ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 51385 1727204596.62016: no more pending results, returning what we have 51385 1727204596.62019: results queue empty 51385 1727204596.62020: checking for any_errors_fatal 51385 1727204596.62026: done checking for any_errors_fatal 51385 1727204596.62027: checking for max_fail_percentage 51385 1727204596.62028: done checking for max_fail_percentage 51385 1727204596.62029: checking to see if all hosts have failed and the running result is not ok 51385 1727204596.62030: done checking to see if all hosts have failed 51385 1727204596.62031: getting the remaining hosts for this loop 51385 1727204596.62032: done getting the remaining hosts for this loop 51385 1727204596.62036: getting the next task for host managed-node1 51385 1727204596.62042: done getting next task for host managed-node1 51385 1727204596.62046: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 51385 1727204596.62049: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 51385 1727204596.62058: getting variables 51385 1727204596.62060: in VariableManager get_vars() 51385 1727204596.62097: Calling all_inventory to load vars for managed-node1 51385 1727204596.62099: Calling groups_inventory to load vars for managed-node1 51385 1727204596.62101: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204596.62110: Calling all_plugins_play to load vars for managed-node1 51385 1727204596.62113: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204596.62115: Calling groups_plugins_play to load vars for managed-node1 51385 1727204596.64168: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204596.65880: done with get_vars() 51385 1727204596.65906: done getting variables 51385 1727204596.65972: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 15:03:16 -0400 (0:00:00.901) 0:00:15.063 ***** 51385 1727204596.66007: entering _queue_task() for managed-node1/service 51385 1727204596.66330: worker is 1 (out of 1 available) 51385 1727204596.66341: exiting _queue_task() for managed-node1/service 
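The "Enable and start wpa_supplicant" task queued next is gated on role state: the trace later evaluates `network_provider == "nm"` and `__network_wpa_supplicant_required` before deciding whether to run it. A hedged sketch of what such a gated task might look like follows; the variable names are taken from the log, but the task body itself is an assumption, not the role's actual source:

```yaml
# Illustrative sketch only; the actual task is defined in the role's tasks/main.yml.
- name: Enable and start wpa_supplicant
  ansible.builtin.systemd:
    name: wpa_supplicant
    state: started
    enabled: true
  when:
    - network_provider == "nm"                      # evaluated in the trace
    - __network_wpa_supplicant_required | bool      # False here, so the task is skipped
```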
51385 1727204596.66354: done queuing things up, now waiting for results queue to drain 51385 1727204596.66356: waiting for pending results... 51385 1727204596.66645: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 51385 1727204596.66801: in run() - task 0affcd87-79f5-6b1f-5706-000000000023 51385 1727204596.66822: variable 'ansible_search_path' from source: unknown 51385 1727204596.66830: variable 'ansible_search_path' from source: unknown 51385 1727204596.66879: calling self._execute() 51385 1727204596.66979: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204596.66989: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204596.67005: variable 'omit' from source: magic vars 51385 1727204596.67404: variable 'ansible_distribution_major_version' from source: facts 51385 1727204596.67422: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204596.67550: variable 'network_provider' from source: set_fact 51385 1727204596.67569: Evaluated conditional (network_provider == "nm"): True 51385 1727204596.67671: variable '__network_wpa_supplicant_required' from source: role '' defaults 51385 1727204596.67772: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 51385 1727204596.67947: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 51385 1727204596.70318: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 51385 1727204596.70399: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 51385 1727204596.70444: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 51385 1727204596.70493: Loading FilterModule 'urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 51385 1727204596.70525: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 51385 1727204596.70630: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 51385 1727204596.70669: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 51385 1727204596.70705: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 51385 1727204596.70752: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 51385 1727204596.70777: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 51385 1727204596.70830: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 51385 1727204596.70862: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 51385 1727204596.70895: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, 
class_only=False) 51385 1727204596.70944: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 51385 1727204596.70969: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 51385 1727204596.71015: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 51385 1727204596.71048: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 51385 1727204596.71083: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 51385 1727204596.71128: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 51385 1727204596.71150: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 51385 1727204596.71309: variable 'network_connections' from source: task vars 51385 1727204596.71326: variable 'interface' from source: play vars 51385 1727204596.71415: variable 'interface' from source: play vars 51385 1727204596.71433: variable 'vlan_interface' from source: play vars 51385 1727204596.71507: variable 'vlan_interface' from source: play vars 51385 1727204596.71519: variable 'interface' from 
source: play vars 51385 1727204596.71590: variable 'interface' from source: play vars 51385 1727204596.71677: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 51385 1727204596.71849: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 51385 1727204596.71896: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 51385 1727204596.71932: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 51385 1727204596.71968: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 51385 1727204596.72017: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 51385 1727204596.72044: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 51385 1727204596.72079: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 51385 1727204596.72108: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 51385 1727204596.72167: variable '__network_wireless_connections_defined' from source: role '' defaults 51385 1727204596.72429: variable 'network_connections' from source: task vars 51385 1727204596.72443: variable 'interface' from source: play vars 51385 1727204596.72502: variable 'interface' from source: play vars 51385 1727204596.72514: variable 'vlan_interface' from source: play vars 51385 1727204596.72576: 
variable 'vlan_interface' from source: play vars 51385 1727204596.72587: variable 'interface' from source: play vars 51385 1727204596.72668: variable 'interface' from source: play vars 51385 1727204596.72717: Evaluated conditional (__network_wpa_supplicant_required): False 51385 1727204596.72726: when evaluation is False, skipping this task 51385 1727204596.72733: _execute() done 51385 1727204596.72739: dumping result to json 51385 1727204596.72747: done dumping result, returning 51385 1727204596.72758: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcd87-79f5-6b1f-5706-000000000023] 51385 1727204596.72779: sending task result for task 0affcd87-79f5-6b1f-5706-000000000023 skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 51385 1727204596.72943: no more pending results, returning what we have 51385 1727204596.72947: results queue empty 51385 1727204596.72948: checking for any_errors_fatal 51385 1727204596.72982: done checking for any_errors_fatal 51385 1727204596.72983: checking for max_fail_percentage 51385 1727204596.72985: done checking for max_fail_percentage 51385 1727204596.72986: checking to see if all hosts have failed and the running result is not ok 51385 1727204596.72987: done checking to see if all hosts have failed 51385 1727204596.72988: getting the remaining hosts for this loop 51385 1727204596.72990: done getting the remaining hosts for this loop 51385 1727204596.72995: getting the next task for host managed-node1 51385 1727204596.73002: done getting next task for host managed-node1 51385 1727204596.73007: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 51385 1727204596.73010: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, 
tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 51385 1727204596.73026: getting variables 51385 1727204596.73028: in VariableManager get_vars() 51385 1727204596.73078: Calling all_inventory to load vars for managed-node1 51385 1727204596.73081: Calling groups_inventory to load vars for managed-node1 51385 1727204596.73083: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204596.73094: Calling all_plugins_play to load vars for managed-node1 51385 1727204596.73097: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204596.73100: Calling groups_plugins_play to load vars for managed-node1 51385 1727204596.74384: done sending task result for task 0affcd87-79f5-6b1f-5706-000000000023 51385 1727204596.74388: WORKER PROCESS EXITING 51385 1727204596.74895: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204596.76746: done with get_vars() 51385 1727204596.76774: done getting variables 51385 1727204596.76836: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 15:03:16 -0400 (0:00:00.108) 0:00:15.172 ***** 51385 1727204596.76873: entering 
_queue_task() for managed-node1/service 51385 1727204596.77197: worker is 1 (out of 1 available) 51385 1727204596.77210: exiting _queue_task() for managed-node1/service 51385 1727204596.77223: done queuing things up, now waiting for results queue to drain 51385 1727204596.77224: waiting for pending results... 51385 1727204596.77512: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable network service 51385 1727204596.77645: in run() - task 0affcd87-79f5-6b1f-5706-000000000024 51385 1727204596.77675: variable 'ansible_search_path' from source: unknown 51385 1727204596.77683: variable 'ansible_search_path' from source: unknown 51385 1727204596.77723: calling self._execute() 51385 1727204596.77817: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204596.77828: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204596.77843: variable 'omit' from source: magic vars 51385 1727204596.78232: variable 'ansible_distribution_major_version' from source: facts 51385 1727204596.78249: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204596.78358: variable 'network_provider' from source: set_fact 51385 1727204596.78374: Evaluated conditional (network_provider == "initscripts"): False 51385 1727204596.78382: when evaluation is False, skipping this task 51385 1727204596.78390: _execute() done 51385 1727204596.78397: dumping result to json 51385 1727204596.78405: done dumping result, returning 51385 1727204596.78417: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable network service [0affcd87-79f5-6b1f-5706-000000000024] 51385 1727204596.78432: sending task result for task 0affcd87-79f5-6b1f-5706-000000000024 skipping: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 51385 1727204596.78591: no more pending 
results, returning what we have 51385 1727204596.78596: results queue empty 51385 1727204596.78598: checking for any_errors_fatal 51385 1727204596.78610: done checking for any_errors_fatal 51385 1727204596.78611: checking for max_fail_percentage 51385 1727204596.78613: done checking for max_fail_percentage 51385 1727204596.78614: checking to see if all hosts have failed and the running result is not ok 51385 1727204596.78615: done checking to see if all hosts have failed 51385 1727204596.78616: getting the remaining hosts for this loop 51385 1727204596.78618: done getting the remaining hosts for this loop 51385 1727204596.78622: getting the next task for host managed-node1 51385 1727204596.78630: done getting next task for host managed-node1 51385 1727204596.78634: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 51385 1727204596.78638: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51385 1727204596.78655: getting variables 51385 1727204596.78657: in VariableManager get_vars() 51385 1727204596.78709: Calling all_inventory to load vars for managed-node1 51385 1727204596.78712: Calling groups_inventory to load vars for managed-node1 51385 1727204596.78715: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204596.78728: Calling all_plugins_play to load vars for managed-node1 51385 1727204596.78731: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204596.78735: Calling groups_plugins_play to load vars for managed-node1 51385 1727204596.79772: done sending task result for task 0affcd87-79f5-6b1f-5706-000000000024 51385 1727204596.79776: WORKER PROCESS EXITING 51385 1727204596.80514: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204596.82216: done with get_vars() 51385 1727204596.82250: done getting variables 51385 1727204596.82318: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 15:03:16 -0400 (0:00:00.054) 0:00:15.227 ***** 51385 1727204596.82355: entering _queue_task() for managed-node1/copy 51385 1727204596.82697: worker is 1 (out of 1 available) 51385 1727204596.82712: exiting _queue_task() for managed-node1/copy 51385 1727204596.82725: done queuing things up, now waiting for results queue to drain 51385 1727204596.82727: waiting for pending results... 
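The two skips above ("Enable and start wpa_supplicant", "Enable network service") both follow the same pattern: the task's `when:` expression evaluates to False, the executor short-circuits before running the module, and the result records the failing expression as `false_condition` with `skip_reason: "Conditional result was False"`. The "Enable network service" result is additionally censored because the task sets `no_log`. A minimal sketch of what such a task typically looks like — this is an illustrative reconstruction, not the actual contents of the role's `tasks/main.yml`:

```yaml
# Hypothetical sketch of a conditionally-skipped service task.
# When `network_provider` is anything other than "initscripts", the
# executor logs "when evaluation is False, skipping this task" and
# emits skipping: => { "false_condition": ... } as seen in the log.
- name: Enable network service
  service:
    name: network
    enabled: true
  when: network_provider == "initscripts"
  no_log: true   # produces the "output has been hidden" censored result
```

Note that a `no_log: true` skip still reports `"changed": false`, but replaces the rest of the result body with the censorship notice shown above.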
51385 1727204596.83027: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 51385 1727204596.83181: in run() - task 0affcd87-79f5-6b1f-5706-000000000025 51385 1727204596.83202: variable 'ansible_search_path' from source: unknown 51385 1727204596.83211: variable 'ansible_search_path' from source: unknown 51385 1727204596.83256: calling self._execute() 51385 1727204596.83363: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204596.83378: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204596.83397: variable 'omit' from source: magic vars 51385 1727204596.83796: variable 'ansible_distribution_major_version' from source: facts 51385 1727204596.83818: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204596.83943: variable 'network_provider' from source: set_fact 51385 1727204596.83954: Evaluated conditional (network_provider == "initscripts"): False 51385 1727204596.83967: when evaluation is False, skipping this task 51385 1727204596.83976: _execute() done 51385 1727204596.83984: dumping result to json 51385 1727204596.83992: done dumping result, returning 51385 1727204596.84002: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcd87-79f5-6b1f-5706-000000000025] 51385 1727204596.84014: sending task result for task 0affcd87-79f5-6b1f-5706-000000000025 51385 1727204596.84132: done sending task result for task 0affcd87-79f5-6b1f-5706-000000000025 51385 1727204596.84140: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 51385 1727204596.84198: no more pending results, returning what we have 51385 1727204596.84203: results queue empty 51385 1727204596.84205: checking for 
any_errors_fatal 51385 1727204596.84212: done checking for any_errors_fatal 51385 1727204596.84212: checking for max_fail_percentage 51385 1727204596.84214: done checking for max_fail_percentage 51385 1727204596.84216: checking to see if all hosts have failed and the running result is not ok 51385 1727204596.84217: done checking to see if all hosts have failed 51385 1727204596.84217: getting the remaining hosts for this loop 51385 1727204596.84219: done getting the remaining hosts for this loop 51385 1727204596.84224: getting the next task for host managed-node1 51385 1727204596.84231: done getting next task for host managed-node1 51385 1727204596.84236: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 51385 1727204596.84240: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51385 1727204596.84256: getting variables 51385 1727204596.84261: in VariableManager get_vars() 51385 1727204596.84310: Calling all_inventory to load vars for managed-node1 51385 1727204596.84313: Calling groups_inventory to load vars for managed-node1 51385 1727204596.84316: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204596.84329: Calling all_plugins_play to load vars for managed-node1 51385 1727204596.84332: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204596.84335: Calling groups_plugins_play to load vars for managed-node1 51385 1727204596.86219: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204596.87899: done with get_vars() 51385 1727204596.87927: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 15:03:16 -0400 (0:00:00.056) 0:00:15.284 ***** 51385 1727204596.88022: entering _queue_task() for managed-node1/fedora.linux_system_roles.network_connections 51385 1727204596.88024: Creating lock for fedora.linux_system_roles.network_connections 51385 1727204596.88365: worker is 1 (out of 1 available) 51385 1727204596.88377: exiting _queue_task() for managed-node1/fedora.linux_system_roles.network_connections 51385 1727204596.88388: done queuing things up, now waiting for results queue to drain 51385 1727204596.88389: waiting for pending results... 
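The "Configure networking connection profiles" task that follows is the first one in this stretch whose conditional passes, so the log now shows the full execution path: resolving the `network_connections` task variable from the play vars `interface` and `vlan_interface`, rendering `get_ansible_managed.j2` via the template lookup, opening the SSH connection, and shipping the `network_connections` module as an AnsiballZ payload. A hedged sketch of the kind of playbook input that would drive this (the variable names come from the log; the device names and connection details are assumptions, not taken from the actual test playbook):

```yaml
# Hypothetical play sketch. `interface` and `vlan_interface` appear as
# play vars in the log; the concrete values below are illustrative only.
- hosts: managed-node1
  vars:
    interface: eth0              # assumption: example base device
    vlan_interface: eth0.90      # assumption: example VLAN device
    network_connections:
      - name: "{{ interface }}"
        type: ethernet
      - name: "{{ vlan_interface }}"
        type: vlan
        parent: "{{ interface }}"
  roles:
    - fedora.linux_system_roles.network
```

With input like this, the role reaches `tasks/main.yml:159` and invokes the collection's `network_connections` action, which is what triggers the `_low_level_execute_command()` / sftp transfer sequence that follows in the log.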
51385 1727204596.88680: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 51385 1727204596.88822: in run() - task 0affcd87-79f5-6b1f-5706-000000000026 51385 1727204596.88848: variable 'ansible_search_path' from source: unknown 51385 1727204596.88854: variable 'ansible_search_path' from source: unknown 51385 1727204596.88897: calling self._execute() 51385 1727204596.89005: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204596.89020: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204596.89035: variable 'omit' from source: magic vars 51385 1727204596.89453: variable 'ansible_distribution_major_version' from source: facts 51385 1727204596.89479: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204596.89494: variable 'omit' from source: magic vars 51385 1727204596.89553: variable 'omit' from source: magic vars 51385 1727204596.89741: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 51385 1727204596.92111: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 51385 1727204596.92188: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 51385 1727204596.92237: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 51385 1727204596.92300: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 51385 1727204596.92344: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 51385 1727204596.92427: variable 'network_provider' from source: set_fact 51385 1727204596.92533: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 51385 1727204596.92572: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 51385 1727204596.92591: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 51385 1727204596.92617: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 51385 1727204596.92629: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 51385 1727204596.92689: variable 'omit' from source: magic vars 51385 1727204596.92782: variable 'omit' from source: magic vars 51385 1727204596.92854: variable 'network_connections' from source: task vars 51385 1727204596.92869: variable 'interface' from source: play vars 51385 1727204596.92918: variable 'interface' from source: play vars 51385 1727204596.92928: variable 'vlan_interface' from source: play vars 51385 1727204596.92978: variable 'vlan_interface' from source: play vars 51385 1727204596.92981: variable 'interface' from source: play vars 51385 1727204596.93027: variable 'interface' from source: play vars 51385 1727204596.93155: variable 'omit' from source: magic vars 51385 1727204596.93163: variable '__lsr_ansible_managed' from source: task vars 51385 1727204596.93211: variable '__lsr_ansible_managed' from source: task vars 51385 1727204596.93399: Loaded config def from plugin (lookup/template) 51385 1727204596.93405: Loading LookupModule 'template' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 51385 1727204596.93429: File lookup term: get_ansible_managed.j2 51385 1727204596.93432: variable 'ansible_search_path' from source: unknown 51385 1727204596.93435: evaluation_path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 51385 1727204596.93446: search_path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 51385 1727204596.93459: variable 'ansible_search_path' from source: unknown 51385 1727204596.97761: variable 'ansible_managed' from source: unknown 51385 1727204596.97850: variable 'omit' from source: magic vars 51385 1727204596.97875: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 51385 1727204596.97896: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 51385 1727204596.97914: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 51385 1727204596.97927: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 51385 1727204596.97936: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 51385 1727204596.97957: variable 'inventory_hostname' from source: host vars for 'managed-node1' 51385 1727204596.97963: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204596.97967: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204596.98031: Set connection var ansible_pipelining to False 51385 1727204596.98034: Set connection var ansible_shell_type to sh 51385 1727204596.98042: Set connection var ansible_module_compression to ZIP_DEFLATED 51385 1727204596.98049: Set connection var ansible_timeout to 10 51385 1727204596.98052: Set connection var ansible_connection to ssh 51385 1727204596.98056: Set connection var ansible_shell_executable to /bin/sh 51385 1727204596.98076: variable 'ansible_shell_executable' from source: unknown 51385 1727204596.98079: variable 'ansible_connection' from source: unknown 51385 1727204596.98082: variable 'ansible_module_compression' from source: unknown 51385 1727204596.98085: variable 'ansible_shell_type' from source: unknown 51385 1727204596.98087: variable 'ansible_shell_executable' from source: unknown 51385 1727204596.98089: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204596.98091: variable 'ansible_pipelining' from source: unknown 51385 1727204596.98095: variable 'ansible_timeout' from source: unknown 51385 1727204596.98099: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204596.98196: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 51385 1727204596.98204: variable 'omit' from source: magic vars 51385 1727204596.98211: starting attempt loop 51385 1727204596.98214: running the handler 51385 1727204596.98229: _low_level_execute_command(): starting 51385 1727204596.98233: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 51385 1727204596.98793: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204596.98838: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204596.98990: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204596.99030: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204597.00673: stdout chunk (state=3): >>>/root <<< 51385 1727204597.00828: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204597.00895: stderr chunk (state=3): >>><<< 51385 1727204597.00904: 
stdout chunk (state=3): >>><<< 51385 1727204597.00938: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204597.00956: _low_level_execute_command(): starting 51385 1727204597.00973: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204597.0094569-52519-195496634025661 `" && echo ansible-tmp-1727204597.0094569-52519-195496634025661="` echo /root/.ansible/tmp/ansible-tmp-1727204597.0094569-52519-195496634025661 `" ) && sleep 0' 51385 1727204597.01750: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204597.01770: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204597.01786: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 51385 1727204597.01810: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204597.01866: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204597.01880: stderr chunk (state=3): >>>debug2: match not found <<< 51385 1727204597.01896: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204597.01919: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 51385 1727204597.01932: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 51385 1727204597.01944: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 51385 1727204597.01957: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204597.01984: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204597.02000: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204597.02017: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204597.02029: stderr chunk (state=3): >>>debug2: match found <<< 51385 1727204597.02044: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204597.02136: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204597.02162: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204597.02182: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204597.02284: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204597.04152: stdout chunk (state=3): 
>>>ansible-tmp-1727204597.0094569-52519-195496634025661=/root/.ansible/tmp/ansible-tmp-1727204597.0094569-52519-195496634025661 <<< 51385 1727204597.04264: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204597.04373: stderr chunk (state=3): >>><<< 51385 1727204597.04397: stdout chunk (state=3): >>><<< 51385 1727204597.04578: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204597.0094569-52519-195496634025661=/root/.ansible/tmp/ansible-tmp-1727204597.0094569-52519-195496634025661 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204597.04582: variable 'ansible_module_compression' from source: unknown 51385 1727204597.04584: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 51385 1727204597.04586: ANSIBALLZ: Acquiring lock 51385 1727204597.04588: ANSIBALLZ: Lock acquired: 140124836224352 51385 1727204597.04589: 
ANSIBALLZ: Creating module 51385 1727204597.26274: ANSIBALLZ: Writing module into payload 51385 1727204597.26628: ANSIBALLZ: Writing module 51385 1727204597.26653: ANSIBALLZ: Renaming module 51385 1727204597.26657: ANSIBALLZ: Done creating module 51385 1727204597.26681: variable 'ansible_facts' from source: unknown 51385 1727204597.26752: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204597.0094569-52519-195496634025661/AnsiballZ_network_connections.py 51385 1727204597.26870: Sending initial data 51385 1727204597.26881: Sent initial data (168 bytes) 51385 1727204597.27589: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204597.27596: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204597.27631: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204597.27638: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204597.27647: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204597.27656: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204597.27663: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204597.27732: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204597.27735: stderr chunk 
(state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204597.27737: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204597.27799: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204597.29519: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 51385 1727204597.29573: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 51385 1727204597.29626: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-51385tpxlmlox/tmpsno9yx0o /root/.ansible/tmp/ansible-tmp-1727204597.0094569-52519-195496634025661/AnsiballZ_network_connections.py <<< 51385 1727204597.29679: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 51385 1727204597.30895: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204597.31018: stderr chunk (state=3): >>><<< 51385 1727204597.31021: stdout chunk (state=3): >>><<< 51385 1727204597.31038: done transferring module to remote 51385 1727204597.31050: _low_level_execute_command(): starting 51385 1727204597.31052: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204597.0094569-52519-195496634025661/ 
/root/.ansible/tmp/ansible-tmp-1727204597.0094569-52519-195496634025661/AnsiballZ_network_connections.py && sleep 0' 51385 1727204597.31530: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204597.31534: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204597.31587: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204597.31590: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204597.31592: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204597.31644: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204597.31654: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204597.31727: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204597.33428: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204597.33489: stderr chunk (state=3): >>><<< 51385 1727204597.33493: stdout chunk (state=3): >>><<< 51385 1727204597.33507: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204597.33510: _low_level_execute_command(): starting 51385 1727204597.33515: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204597.0094569-52519-195496634025661/AnsiballZ_network_connections.py && sleep 0' 51385 1727204597.33989: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204597.33995: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204597.34024: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 51385 1727204597.34029: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204597.34038: 
stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 51385 1727204597.34044: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204597.34051: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204597.34068: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found <<< 51385 1727204597.34080: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204597.34128: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204597.34144: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204597.34209: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204597.65699: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[005] #0, state:up persistent_state:present, 'lsr101': add connection lsr101, 1618f11a-f530-4474-ba61-deb3b396c4a9\n[006] #1, state:up persistent_state:present, 'lsr101.90': add connection lsr101.90, b2e59a26-dd89-4665-aa15-863b790a948c\n[007] #0, state:up persistent_state:present, 'lsr101': up connection lsr101, 1618f11a-f530-4474-ba61-deb3b396c4a9 (not-active)\n[008] #1, state:up persistent_state:present, 'lsr101.90': up connection lsr101.90, b2e59a26-dd89-4665-aa15-863b790a948c (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr101", "type": "ethernet", "state": "up", "mtu": 1492, "autoconnect": false, "ip": {"dhcp4": false, "auto6": false}}, {"name": "lsr101.90", "parent": "lsr101", "type": "vlan", "vlan_id": 90, "mtu": 1280, "state": "up", 
"autoconnect": false, "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr101", "type": "ethernet", "state": "up", "mtu": 1492, "autoconnect": false, "ip": {"dhcp4": false, "auto6": false}}, {"name": "lsr101.90", "parent": "lsr101", "type": "vlan", "vlan_id": 90, "mtu": 1280, "state": "up", "autoconnect": false, "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 51385 1727204597.68462: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. <<< 51385 1727204597.68522: stderr chunk (state=3): >>><<< 51385 1727204597.68527: stdout chunk (state=3): >>><<< 51385 1727204597.68550: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[005] #0, state:up persistent_state:present, 'lsr101': add connection lsr101, 1618f11a-f530-4474-ba61-deb3b396c4a9\n[006] #1, state:up persistent_state:present, 'lsr101.90': add connection lsr101.90, b2e59a26-dd89-4665-aa15-863b790a948c\n[007] #0, state:up persistent_state:present, 'lsr101': up connection lsr101, 1618f11a-f530-4474-ba61-deb3b396c4a9 (not-active)\n[008] #1, state:up persistent_state:present, 'lsr101.90': up connection lsr101.90, b2e59a26-dd89-4665-aa15-863b790a948c (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr101", "type": "ethernet", "state": "up", "mtu": 1492, "autoconnect": false, "ip": {"dhcp4": false, "auto6": false}}, {"name": "lsr101.90", "parent": "lsr101", "type": "vlan", "vlan_id": 90, "mtu": 1280, "state": "up", "autoconnect": false, "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# 
system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr101", "type": "ethernet", "state": "up", "mtu": 1492, "autoconnect": false, "ip": {"dhcp4": false, "auto6": false}}, {"name": "lsr101.90", "parent": "lsr101", "type": "vlan", "vlan_id": 90, "mtu": 1280, "state": "up", "autoconnect": false, "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
51385 1727204597.68598: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'lsr101', 'type': 'ethernet', 'state': 'up', 'mtu': 1492, 'autoconnect': False, 'ip': {'dhcp4': False, 'auto6': False}}, {'name': 'lsr101.90', 'parent': 'lsr101', 'type': 'vlan', 'vlan_id': 90, 'mtu': 1280, 'state': 'up', 'autoconnect': False, 'ip': {'dhcp4': False, 'auto6': False}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204597.0094569-52519-195496634025661/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 51385 1727204597.68605: _low_level_execute_command(): starting 51385 1727204597.68610: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204597.0094569-52519-195496634025661/ > /dev/null 2>&1 && sleep 0' 51385 1727204597.69092: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204597.69103: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204597.69133: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204597.69145: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204597.69155: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204597.69213: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204597.69217: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204597.69227: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204597.69301: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204597.71126: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204597.71184: stderr chunk (state=3): >>><<< 51385 1727204597.71188: stdout chunk (state=3): >>><<< 51385 1727204597.71201: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204597.71208: handler run complete 51385 1727204597.71244: attempt loop complete, returning result 51385 1727204597.71247: _execute() done 51385 1727204597.71249: dumping result to json 51385 1727204597.71255: done dumping result, returning 51385 1727204597.71265: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcd87-79f5-6b1f-5706-000000000026] 51385 1727204597.71273: sending task result for task 0affcd87-79f5-6b1f-5706-000000000026 51385 1727204597.71395: done sending task result for task 0affcd87-79f5-6b1f-5706-000000000026 51385 1727204597.71398: WORKER PROCESS EXITING changed: [managed-node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": false, "ip": { "auto6": false, "dhcp4": false }, "mtu": 1492, "name": "lsr101", "state": "up", "type": "ethernet" }, { "autoconnect": false, "ip": { "auto6": false, "dhcp4": false }, "mtu": 1280, "name": "lsr101.90", "parent": "lsr101", "state": "up", "type": "vlan", "vlan_id": 90 } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [005] #0, state:up persistent_state:present, 'lsr101': add connection lsr101, 1618f11a-f530-4474-ba61-deb3b396c4a9 [006] #1, state:up persistent_state:present, 'lsr101.90': add connection lsr101.90, b2e59a26-dd89-4665-aa15-863b790a948c [007] #0, state:up 
persistent_state:present, 'lsr101': up connection lsr101, 1618f11a-f530-4474-ba61-deb3b396c4a9 (not-active) [008] #1, state:up persistent_state:present, 'lsr101.90': up connection lsr101.90, b2e59a26-dd89-4665-aa15-863b790a948c (not-active) 51385 1727204597.71522: no more pending results, returning what we have 51385 1727204597.71526: results queue empty 51385 1727204597.71527: checking for any_errors_fatal 51385 1727204597.71533: done checking for any_errors_fatal 51385 1727204597.71534: checking for max_fail_percentage 51385 1727204597.71536: done checking for max_fail_percentage 51385 1727204597.71537: checking to see if all hosts have failed and the running result is not ok 51385 1727204597.71538: done checking to see if all hosts have failed 51385 1727204597.71538: getting the remaining hosts for this loop 51385 1727204597.71540: done getting the remaining hosts for this loop 51385 1727204597.71544: getting the next task for host managed-node1 51385 1727204597.71549: done getting next task for host managed-node1 51385 1727204597.71552: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 51385 1727204597.71555: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51385 1727204597.71575: getting variables 51385 1727204597.71577: in VariableManager get_vars() 51385 1727204597.71616: Calling all_inventory to load vars for managed-node1 51385 1727204597.71619: Calling groups_inventory to load vars for managed-node1 51385 1727204597.71621: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204597.71630: Calling all_plugins_play to load vars for managed-node1 51385 1727204597.71632: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204597.71635: Calling groups_plugins_play to load vars for managed-node1 51385 1727204597.72495: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204597.73433: done with get_vars() 51385 1727204597.73451: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 15:03:17 -0400 (0:00:00.854) 0:00:16.139 ***** 51385 1727204597.73517: entering _queue_task() for managed-node1/fedora.linux_system_roles.network_state 51385 1727204597.73519: Creating lock for fedora.linux_system_roles.network_state 51385 1727204597.73750: worker is 1 (out of 1 available) 51385 1727204597.73767: exiting _queue_task() for managed-node1/fedora.linux_system_roles.network_state 51385 1727204597.73779: done queuing things up, now waiting for results queue to drain 51385 1727204597.73781: waiting for pending results... 
51385 1727204597.73969: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking state 51385 1727204597.74053: in run() - task 0affcd87-79f5-6b1f-5706-000000000027 51385 1727204597.74071: variable 'ansible_search_path' from source: unknown 51385 1727204597.74076: variable 'ansible_search_path' from source: unknown 51385 1727204597.74106: calling self._execute() 51385 1727204597.74178: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204597.74190: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204597.74197: variable 'omit' from source: magic vars 51385 1727204597.74475: variable 'ansible_distribution_major_version' from source: facts 51385 1727204597.74485: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204597.74572: variable 'network_state' from source: role '' defaults 51385 1727204597.74582: Evaluated conditional (network_state != {}): False 51385 1727204597.74585: when evaluation is False, skipping this task 51385 1727204597.74587: _execute() done 51385 1727204597.74590: dumping result to json 51385 1727204597.74593: done dumping result, returning 51385 1727204597.74601: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking state [0affcd87-79f5-6b1f-5706-000000000027] 51385 1727204597.74606: sending task result for task 0affcd87-79f5-6b1f-5706-000000000027 51385 1727204597.74692: done sending task result for task 0affcd87-79f5-6b1f-5706-000000000027 51385 1727204597.74695: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 51385 1727204597.74749: no more pending results, returning what we have 51385 1727204597.74752: results queue empty 51385 1727204597.74753: checking for any_errors_fatal 51385 1727204597.74767: done checking for any_errors_fatal 
51385 1727204597.74768: checking for max_fail_percentage 51385 1727204597.74769: done checking for max_fail_percentage 51385 1727204597.74770: checking to see if all hosts have failed and the running result is not ok 51385 1727204597.74771: done checking to see if all hosts have failed 51385 1727204597.74772: getting the remaining hosts for this loop 51385 1727204597.74774: done getting the remaining hosts for this loop 51385 1727204597.74777: getting the next task for host managed-node1 51385 1727204597.74783: done getting next task for host managed-node1 51385 1727204597.74786: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 51385 1727204597.74788: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51385 1727204597.74802: getting variables 51385 1727204597.74804: in VariableManager get_vars() 51385 1727204597.74846: Calling all_inventory to load vars for managed-node1 51385 1727204597.74849: Calling groups_inventory to load vars for managed-node1 51385 1727204597.74851: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204597.74859: Calling all_plugins_play to load vars for managed-node1 51385 1727204597.74862: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204597.74866: Calling groups_plugins_play to load vars for managed-node1 51385 1727204597.75744: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204597.76673: done with get_vars() 51385 1727204597.76690: done getting variables 51385 1727204597.76732: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 15:03:17 -0400 (0:00:00.032) 0:00:16.171 ***** 51385 1727204597.76755: entering _queue_task() for managed-node1/debug 51385 1727204597.76980: worker is 1 (out of 1 available) 51385 1727204597.76994: exiting _queue_task() for managed-node1/debug 51385 1727204597.77005: done queuing things up, now waiting for results queue to drain 51385 1727204597.77006: waiting for pending results... 
51385 1727204597.77194: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 51385 1727204597.77287: in run() - task 0affcd87-79f5-6b1f-5706-000000000028 51385 1727204597.77299: variable 'ansible_search_path' from source: unknown 51385 1727204597.77302: variable 'ansible_search_path' from source: unknown 51385 1727204597.77338: calling self._execute() 51385 1727204597.77407: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204597.77412: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204597.77423: variable 'omit' from source: magic vars 51385 1727204597.77707: variable 'ansible_distribution_major_version' from source: facts 51385 1727204597.77717: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204597.77723: variable 'omit' from source: magic vars 51385 1727204597.77760: variable 'omit' from source: magic vars 51385 1727204597.77794: variable 'omit' from source: magic vars 51385 1727204597.77827: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 51385 1727204597.77854: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 51385 1727204597.77880: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 51385 1727204597.77897: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 51385 1727204597.77907: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 51385 1727204597.77930: variable 'inventory_hostname' from source: host vars for 'managed-node1' 51385 1727204597.77933: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204597.77935: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed-node1' 51385 1727204597.78015: Set connection var ansible_pipelining to False 51385 1727204597.78018: Set connection var ansible_shell_type to sh 51385 1727204597.78026: Set connection var ansible_module_compression to ZIP_DEFLATED 51385 1727204597.78032: Set connection var ansible_timeout to 10 51385 1727204597.78035: Set connection var ansible_connection to ssh 51385 1727204597.78040: Set connection var ansible_shell_executable to /bin/sh 51385 1727204597.78057: variable 'ansible_shell_executable' from source: unknown 51385 1727204597.78060: variable 'ansible_connection' from source: unknown 51385 1727204597.78067: variable 'ansible_module_compression' from source: unknown 51385 1727204597.78070: variable 'ansible_shell_type' from source: unknown 51385 1727204597.78072: variable 'ansible_shell_executable' from source: unknown 51385 1727204597.78075: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204597.78078: variable 'ansible_pipelining' from source: unknown 51385 1727204597.78080: variable 'ansible_timeout' from source: unknown 51385 1727204597.78084: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204597.78191: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 51385 1727204597.78198: variable 'omit' from source: magic vars 51385 1727204597.78207: starting attempt loop 51385 1727204597.78215: running the handler 51385 1727204597.78313: variable '__network_connections_result' from source: set_fact 51385 1727204597.78369: handler run complete 51385 1727204597.78382: attempt loop complete, returning result 51385 1727204597.78385: _execute() done 51385 1727204597.78387: dumping result to json 51385 1727204597.78390: 
done dumping result, returning 51385 1727204597.78397: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcd87-79f5-6b1f-5706-000000000028] 51385 1727204597.78402: sending task result for task 0affcd87-79f5-6b1f-5706-000000000028 51385 1727204597.78490: done sending task result for task 0affcd87-79f5-6b1f-5706-000000000028 51385 1727204597.78493: WORKER PROCESS EXITING ok: [managed-node1] => { "__network_connections_result.stderr_lines": [ "[005] #0, state:up persistent_state:present, 'lsr101': add connection lsr101, 1618f11a-f530-4474-ba61-deb3b396c4a9", "[006] #1, state:up persistent_state:present, 'lsr101.90': add connection lsr101.90, b2e59a26-dd89-4665-aa15-863b790a948c", "[007] #0, state:up persistent_state:present, 'lsr101': up connection lsr101, 1618f11a-f530-4474-ba61-deb3b396c4a9 (not-active)", "[008] #1, state:up persistent_state:present, 'lsr101.90': up connection lsr101.90, b2e59a26-dd89-4665-aa15-863b790a948c (not-active)" ] } 51385 1727204597.78565: no more pending results, returning what we have 51385 1727204597.78569: results queue empty 51385 1727204597.78570: checking for any_errors_fatal 51385 1727204597.78576: done checking for any_errors_fatal 51385 1727204597.78577: checking for max_fail_percentage 51385 1727204597.78579: done checking for max_fail_percentage 51385 1727204597.78580: checking to see if all hosts have failed and the running result is not ok 51385 1727204597.78581: done checking to see if all hosts have failed 51385 1727204597.78581: getting the remaining hosts for this loop 51385 1727204597.78583: done getting the remaining hosts for this loop 51385 1727204597.78587: getting the next task for host managed-node1 51385 1727204597.78593: done getting next task for host managed-node1 51385 1727204597.78596: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 51385 1727204597.78599: ^ state is: 
HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 51385 1727204597.78610: getting variables 51385 1727204597.78611: in VariableManager get_vars() 51385 1727204597.78657: Calling all_inventory to load vars for managed-node1 51385 1727204597.78660: Calling groups_inventory to load vars for managed-node1 51385 1727204597.78662: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204597.78672: Calling all_plugins_play to load vars for managed-node1 51385 1727204597.78675: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204597.78677: Calling groups_plugins_play to load vars for managed-node1 51385 1727204597.79476: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204597.80493: done with get_vars() 51385 1727204597.80507: done getting variables 51385 1727204597.80548: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 15:03:17 -0400 (0:00:00.038) 0:00:16.209 ***** 51385 
1727204597.80579: entering _queue_task() for managed-node1/debug 51385 1727204597.80792: worker is 1 (out of 1 available) 51385 1727204597.80805: exiting _queue_task() for managed-node1/debug 51385 1727204597.80816: done queuing things up, now waiting for results queue to drain 51385 1727204597.80818: waiting for pending results... 51385 1727204597.81010: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 51385 1727204597.81105: in run() - task 0affcd87-79f5-6b1f-5706-000000000029 51385 1727204597.81116: variable 'ansible_search_path' from source: unknown 51385 1727204597.81119: variable 'ansible_search_path' from source: unknown 51385 1727204597.81151: calling self._execute() 51385 1727204597.81219: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204597.81223: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204597.81236: variable 'omit' from source: magic vars 51385 1727204597.81514: variable 'ansible_distribution_major_version' from source: facts 51385 1727204597.81524: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204597.81530: variable 'omit' from source: magic vars 51385 1727204597.81576: variable 'omit' from source: magic vars 51385 1727204597.81599: variable 'omit' from source: magic vars 51385 1727204597.81632: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 51385 1727204597.81659: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 51385 1727204597.81680: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 51385 1727204597.81701: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 51385 1727204597.81711: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 51385 1727204597.81733: variable 'inventory_hostname' from source: host vars for 'managed-node1' 51385 1727204597.81737: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204597.81739: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204597.81818: Set connection var ansible_pipelining to False 51385 1727204597.81822: Set connection var ansible_shell_type to sh 51385 1727204597.81829: Set connection var ansible_module_compression to ZIP_DEFLATED 51385 1727204597.81835: Set connection var ansible_timeout to 10 51385 1727204597.81838: Set connection var ansible_connection to ssh 51385 1727204597.81843: Set connection var ansible_shell_executable to /bin/sh 51385 1727204597.81862: variable 'ansible_shell_executable' from source: unknown 51385 1727204597.81867: variable 'ansible_connection' from source: unknown 51385 1727204597.81870: variable 'ansible_module_compression' from source: unknown 51385 1727204597.81872: variable 'ansible_shell_type' from source: unknown 51385 1727204597.81874: variable 'ansible_shell_executable' from source: unknown 51385 1727204597.81876: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204597.81878: variable 'ansible_pipelining' from source: unknown 51385 1727204597.81880: variable 'ansible_timeout' from source: unknown 51385 1727204597.81888: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204597.81986: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 51385 1727204597.81994: variable 'omit' from source: magic vars 51385 1727204597.82004: starting attempt 
loop 51385 1727204597.82014: running the handler 51385 1727204597.82051: variable '__network_connections_result' from source: set_fact 51385 1727204597.82113: variable '__network_connections_result' from source: set_fact 51385 1727204597.82219: handler run complete 51385 1727204597.82244: attempt loop complete, returning result 51385 1727204597.82247: _execute() done 51385 1727204597.82250: dumping result to json 51385 1727204597.82254: done dumping result, returning 51385 1727204597.82264: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcd87-79f5-6b1f-5706-000000000029] 51385 1727204597.82268: sending task result for task 0affcd87-79f5-6b1f-5706-000000000029 51385 1727204597.82369: done sending task result for task 0affcd87-79f5-6b1f-5706-000000000029 51385 1727204597.82372: WORKER PROCESS EXITING
ok: [managed-node1] => {
    "__network_connections_result": {
        "_invocation": {
            "module_args": {
                "__debug_flags": "",
                "__header": "#\n# Ansible managed\n#\n# system_role:network\n",
                "connections": [
                    {
                        "autoconnect": false,
                        "ip": {
                            "auto6": false,
                            "dhcp4": false
                        },
                        "mtu": 1492,
                        "name": "lsr101",
                        "state": "up",
                        "type": "ethernet"
                    },
                    {
                        "autoconnect": false,
                        "ip": {
                            "auto6": false,
                            "dhcp4": false
                        },
                        "mtu": 1280,
                        "name": "lsr101.90",
                        "parent": "lsr101",
                        "state": "up",
                        "type": "vlan",
                        "vlan_id": 90
                    }
                ],
                "force_state_change": false,
                "ignore_errors": false,
                "provider": "nm"
            }
        },
        "changed": true,
        "failed": false,
        "stderr": "[005] #0, state:up persistent_state:present, 'lsr101': add connection lsr101, 1618f11a-f530-4474-ba61-deb3b396c4a9\n[006] #1, state:up persistent_state:present, 'lsr101.90': add connection lsr101.90, b2e59a26-dd89-4665-aa15-863b790a948c\n[007] #0, state:up persistent_state:present, 'lsr101': up connection lsr101, 1618f11a-f530-4474-ba61-deb3b396c4a9 (not-active)\n[008] #1, state:up persistent_state:present, 'lsr101.90': up connection lsr101.90, b2e59a26-dd89-4665-aa15-863b790a948c (not-active)\n",
        "stderr_lines": [
            "[005] #0, state:up persistent_state:present, 'lsr101': add connection lsr101, 1618f11a-f530-4474-ba61-deb3b396c4a9",
            "[006] #1, state:up persistent_state:present, 'lsr101.90': add connection lsr101.90, b2e59a26-dd89-4665-aa15-863b790a948c",
            "[007] #0, state:up persistent_state:present, 'lsr101': up connection lsr101, 1618f11a-f530-4474-ba61-deb3b396c4a9 (not-active)",
            "[008] #1, state:up persistent_state:present, 'lsr101.90': up connection lsr101.90, b2e59a26-dd89-4665-aa15-863b790a948c (not-active)"
        ]
    }
}
51385 1727204597.82478: no more pending results, returning what we have 51385 1727204597.82481: results queue empty 51385 1727204597.82482: checking for any_errors_fatal 51385 1727204597.82485: done checking for any_errors_fatal 51385 1727204597.82486: checking for max_fail_percentage 51385 1727204597.82487: done checking for max_fail_percentage 51385 1727204597.82488: checking to see if all hosts have failed and the running result is not ok 51385 1727204597.82489: done checking to see if all hosts have failed 51385 1727204597.82490: getting the remaining hosts for this loop 51385 1727204597.82491: done getting the remaining hosts for this loop 51385 1727204597.82494: getting the next task for host managed-node1 51385 1727204597.82499: done getting next task for host managed-node1 51385 1727204597.82508: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 51385 1727204597.82511: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 51385 1727204597.82519: getting variables 51385 1727204597.82521: in VariableManager get_vars() 51385 1727204597.82557: Calling all_inventory to load vars for managed-node1 51385 1727204597.82562: Calling groups_inventory to load vars for managed-node1 51385 1727204597.82565: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204597.82572: Calling all_plugins_play to load vars for managed-node1 51385 1727204597.82573: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204597.82575: Calling groups_plugins_play to load vars for managed-node1 51385 1727204597.83367: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204597.84297: done with get_vars() 51385 1727204597.84312: done getting variables 51385 1727204597.84352: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 15:03:17 -0400 (0:00:00.037) 0:00:16.247 ***** 51385 1727204597.84376: entering _queue_task() for managed-node1/debug 51385 1727204597.84576: worker is 1 (out of 1 available) 51385 1727204597.84589: exiting _queue_task() for managed-node1/debug 51385 1727204597.84600: done queuing things up, now waiting for results queue to drain 51385 1727204597.84601: waiting for pending results... 
51385 1727204597.84786: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 51385 1727204597.84876: in run() - task 0affcd87-79f5-6b1f-5706-00000000002a 51385 1727204597.84889: variable 'ansible_search_path' from source: unknown 51385 1727204597.84892: variable 'ansible_search_path' from source: unknown 51385 1727204597.84920: calling self._execute() 51385 1727204597.84994: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204597.84998: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204597.85006: variable 'omit' from source: magic vars 51385 1727204597.85277: variable 'ansible_distribution_major_version' from source: facts 51385 1727204597.85287: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204597.85371: variable 'network_state' from source: role '' defaults 51385 1727204597.85383: Evaluated conditional (network_state != {}): False 51385 1727204597.85387: when evaluation is False, skipping this task 51385 1727204597.85389: _execute() done 51385 1727204597.85392: dumping result to json 51385 1727204597.85394: done dumping result, returning 51385 1727204597.85397: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcd87-79f5-6b1f-5706-00000000002a] 51385 1727204597.85404: sending task result for task 0affcd87-79f5-6b1f-5706-00000000002a 51385 1727204597.85488: done sending task result for task 0affcd87-79f5-6b1f-5706-00000000002a 51385 1727204597.85492: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "false_condition": "network_state != {}"
}
51385 1727204597.85539: no more pending results, returning what we have 51385 1727204597.85543: results queue empty 51385 1727204597.85544: checking for any_errors_fatal 51385 1727204597.85550: done checking for any_errors_fatal 51385 1727204597.85551: checking for
max_fail_percentage 51385 1727204597.85552: done checking for max_fail_percentage 51385 1727204597.85553: checking to see if all hosts have failed and the running result is not ok 51385 1727204597.85554: done checking to see if all hosts have failed 51385 1727204597.85555: getting the remaining hosts for this loop 51385 1727204597.85556: done getting the remaining hosts for this loop 51385 1727204597.85560: getting the next task for host managed-node1 51385 1727204597.85567: done getting next task for host managed-node1 51385 1727204597.85571: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 51385 1727204597.85574: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51385 1727204597.85587: getting variables 51385 1727204597.85588: in VariableManager get_vars() 51385 1727204597.85631: Calling all_inventory to load vars for managed-node1 51385 1727204597.85634: Calling groups_inventory to load vars for managed-node1 51385 1727204597.85636: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204597.85644: Calling all_plugins_play to load vars for managed-node1 51385 1727204597.85645: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204597.85647: Calling groups_plugins_play to load vars for managed-node1 51385 1727204597.86540: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204597.87467: done with get_vars() 51385 1727204597.87482: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 15:03:17 -0400 (0:00:00.031) 0:00:16.279 ***** 51385 1727204597.87545: entering _queue_task() for managed-node1/ping 51385 1727204597.87546: Creating lock for ping 51385 1727204597.87752: worker is 1 (out of 1 available) 51385 1727204597.87768: exiting _queue_task() for managed-node1/ping 51385 1727204597.87779: done queuing things up, now waiting for results queue to drain 51385 1727204597.87780: waiting for pending results... 
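The "Re-test connectivity" task queued above runs Ansible's `ping` module on the managed node, which is why the execution trace that follows ends with `{"ping": "pong", ...}` on stdout. As a minimal sketch (not the actual module source), the module's contract is simply: echo back the `data` argument, defaulting to `"pong"`, as JSON:

```python
# Sketch of the ping module's contract, assuming only its documented
# behavior: it returns {"ping": <data>} with data defaulting to "pong".
import json

def ping(module_args):
    data = module_args.get("data", "pong")
    # The real module deliberately raises when data == "crash", to let
    # test suites exercise Ansible's error handling.
    if data == "crash":
        raise RuntimeError("boom")
    return {"ping": data}

result = ping({})
# The wrapper script prints a JSON result much like the stdout chunk
# captured later in this log.
print(json.dumps({"ping": result["ping"],
                  "invocation": {"module_args": {"data": "pong"}}}))
```

The surrounding trace shows the transport mechanics around that contract: create a remote temp dir, transfer the AnsiballZ-wrapped `AnsiballZ_ping.py` over sftp, `chmod u+x` it, run it with the remote Python, then remove the temp dir.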
51385 1727204597.87957: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 51385 1727204597.88048: in run() - task 0affcd87-79f5-6b1f-5706-00000000002b 51385 1727204597.88061: variable 'ansible_search_path' from source: unknown 51385 1727204597.88066: variable 'ansible_search_path' from source: unknown 51385 1727204597.88094: calling self._execute() 51385 1727204597.88162: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204597.88168: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204597.88175: variable 'omit' from source: magic vars 51385 1727204597.88443: variable 'ansible_distribution_major_version' from source: facts 51385 1727204597.88458: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204597.88468: variable 'omit' from source: magic vars 51385 1727204597.88505: variable 'omit' from source: magic vars 51385 1727204597.88529: variable 'omit' from source: magic vars 51385 1727204597.88572: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 51385 1727204597.88599: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 51385 1727204597.88617: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 51385 1727204597.88630: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 51385 1727204597.88639: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 51385 1727204597.88671: variable 'inventory_hostname' from source: host vars for 'managed-node1' 51385 1727204597.88674: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204597.88677: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node1' 51385 1727204597.88741: Set connection var ansible_pipelining to False 51385 1727204597.88744: Set connection var ansible_shell_type to sh 51385 1727204597.88751: Set connection var ansible_module_compression to ZIP_DEFLATED 51385 1727204597.88761: Set connection var ansible_timeout to 10 51385 1727204597.88767: Set connection var ansible_connection to ssh 51385 1727204597.88777: Set connection var ansible_shell_executable to /bin/sh 51385 1727204597.88793: variable 'ansible_shell_executable' from source: unknown 51385 1727204597.88796: variable 'ansible_connection' from source: unknown 51385 1727204597.88798: variable 'ansible_module_compression' from source: unknown 51385 1727204597.88801: variable 'ansible_shell_type' from source: unknown 51385 1727204597.88803: variable 'ansible_shell_executable' from source: unknown 51385 1727204597.88805: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204597.88809: variable 'ansible_pipelining' from source: unknown 51385 1727204597.88811: variable 'ansible_timeout' from source: unknown 51385 1727204597.88815: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204597.88966: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 51385 1727204597.88973: variable 'omit' from source: magic vars 51385 1727204597.88983: starting attempt loop 51385 1727204597.88987: running the handler 51385 1727204597.88998: _low_level_execute_command(): starting 51385 1727204597.89006: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 51385 1727204597.89541: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 
1727204597.89558: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 51385 1727204597.89573: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204597.89584: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204597.89627: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204597.89655: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204597.89704: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204597.91267: stdout chunk (state=3): >>>/root <<< 51385 1727204597.91364: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204597.91418: stderr chunk (state=3): >>><<< 51385 1727204597.91422: stdout chunk (state=3): >>><<< 51385 1727204597.91440: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204597.91451: _low_level_execute_command(): starting 51385 1727204597.91457: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204597.9144025-52545-147892903301840 `" && echo ansible-tmp-1727204597.9144025-52545-147892903301840="` echo /root/.ansible/tmp/ansible-tmp-1727204597.9144025-52545-147892903301840 `" ) && sleep 0' 51385 1727204597.92086: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204597.92113: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204597.92137: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204597.92157: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204597.92252: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204597.94084: stdout chunk (state=3): >>>ansible-tmp-1727204597.9144025-52545-147892903301840=/root/.ansible/tmp/ansible-tmp-1727204597.9144025-52545-147892903301840 <<< 51385 1727204597.94248: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204597.94251: stderr chunk (state=3): >>><<< 51385 1727204597.94252: stdout chunk (state=3): >>><<< 51385 1727204597.94335: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204597.9144025-52545-147892903301840=/root/.ansible/tmp/ansible-tmp-1727204597.9144025-52545-147892903301840 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204597.94340: variable 'ansible_module_compression' from source: unknown 51385 1727204597.94343: ANSIBALLZ: Using lock for ping 51385 1727204597.94345: ANSIBALLZ: Acquiring lock 51385 1727204597.94347: ANSIBALLZ: Lock acquired: 140124833178224 51385 1727204597.94349: ANSIBALLZ: Creating module 51385 1727204598.09952: ANSIBALLZ: Writing module into payload 51385 1727204598.10047: ANSIBALLZ: Writing module 51385 1727204598.10194: ANSIBALLZ: Renaming module 51385 1727204598.10206: ANSIBALLZ: Done creating module 51385 1727204598.10229: variable 'ansible_facts' from source: unknown 51385 1727204598.10324: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204597.9144025-52545-147892903301840/AnsiballZ_ping.py 51385 1727204598.10508: Sending initial data 51385 1727204598.10512: Sent initial data (153 bytes) 51385 1727204598.11455: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204598.11462: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204598.11477: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204598.11480: stderr chunk (state=3): >>>debug2: match not found <<< 51385 1727204598.11488: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204598.11511: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: 
re-parsing configuration <<< 51385 1727204598.11514: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found <<< 51385 1727204598.11517: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204598.11566: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204598.11575: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204598.11647: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204598.13398: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 51385 1727204598.13453: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 51385 1727204598.13517: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-51385tpxlmlox/tmpkfi331zd /root/.ansible/tmp/ansible-tmp-1727204597.9144025-52545-147892903301840/AnsiballZ_ping.py <<< 51385 1727204598.13561: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 51385 
1727204598.15740: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204598.16082: stderr chunk (state=3): >>><<< 51385 1727204598.16086: stdout chunk (state=3): >>><<< 51385 1727204598.16089: done transferring module to remote 51385 1727204598.16091: _low_level_execute_command(): starting 51385 1727204598.16093: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204597.9144025-52545-147892903301840/ /root/.ansible/tmp/ansible-tmp-1727204597.9144025-52545-147892903301840/AnsiballZ_ping.py && sleep 0' 51385 1727204598.17745: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204598.17750: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204598.17755: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204598.17758: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204598.17762: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204598.17765: stderr chunk (state=3): >>>debug2: match not found <<< 51385 1727204598.17769: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204598.17771: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 51385 1727204598.17773: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 51385 1727204598.17774: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 51385 1727204598.17776: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204598.17778: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204598.17780: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204598.17782: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204598.17784: stderr chunk (state=3): >>>debug2: match found <<< 51385 1727204598.17785: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204598.17791: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204598.17793: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204598.17795: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204598.17799: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204598.19504: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204598.19509: stdout chunk (state=3): >>><<< 51385 1727204598.19511: stderr chunk (state=3): >>><<< 51385 1727204598.19575: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204598.19579: _low_level_execute_command(): starting 51385 1727204598.19581: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204597.9144025-52545-147892903301840/AnsiballZ_ping.py && sleep 0' 51385 1727204598.20495: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204598.20512: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204598.20526: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204598.20543: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204598.20598: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204598.20611: stderr chunk (state=3): >>>debug2: match not found <<< 51385 1727204598.20623: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204598.20639: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 51385 1727204598.20649: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 51385 1727204598.20659: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 51385 1727204598.20676: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204598.20699: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204598.20716: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204598.20728: stderr chunk (state=3): >>>debug2: checking match for 
'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204598.20740: stderr chunk (state=3): >>>debug2: match found <<< 51385 1727204598.20754: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204598.20843: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204598.20869: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204598.20886: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204598.20983: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204598.33886: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 51385 1727204598.34822: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. <<< 51385 1727204598.34890: stderr chunk (state=3): >>><<< 51385 1727204598.34894: stdout chunk (state=3): >>><<< 51385 1727204598.34910: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 51385 1727204598.34931: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204597.9144025-52545-147892903301840/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 51385 1727204598.34941: _low_level_execute_command(): starting 51385 1727204598.34946: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204597.9144025-52545-147892903301840/ > /dev/null 2>&1 && sleep 0' 51385 1727204598.35429: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204598.35433: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204598.35493: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204598.35497: stderr chunk (state=3): >>>debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration <<< 51385 1727204598.35499: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204598.35501: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found <<< 51385 1727204598.35508: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204598.35552: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204598.35560: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204598.35660: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204598.37421: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204598.37480: stderr chunk (state=3): >>><<< 51385 1727204598.37485: stdout chunk (state=3): >>><<< 51385 1727204598.37499: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204598.37506: handler run complete 51385 1727204598.37522: attempt loop complete, returning result 51385 1727204598.37525: _execute() done 51385 1727204598.37530: dumping result to json 51385 1727204598.37534: done dumping result, returning 51385 1727204598.37542: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcd87-79f5-6b1f-5706-00000000002b] 51385 1727204598.37547: sending task result for task 0affcd87-79f5-6b1f-5706-00000000002b 51385 1727204598.37639: done sending task result for task 0affcd87-79f5-6b1f-5706-00000000002b 51385 1727204598.37642: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "ping": "pong" } 51385 1727204598.37705: no more pending results, returning what we have 51385 1727204598.37709: results queue empty 51385 1727204598.37710: checking for any_errors_fatal 51385 1727204598.37717: done checking for any_errors_fatal 51385 1727204598.37718: checking for max_fail_percentage 51385 1727204598.37720: done checking for max_fail_percentage 51385 1727204598.37720: checking to see if all hosts have failed and the running result is not ok 51385 1727204598.37721: done checking to see if all hosts have failed 51385 1727204598.37722: getting the remaining hosts for this loop 51385 1727204598.37724: done getting the remaining hosts for this loop 51385 1727204598.37727: getting the next task for host managed-node1 51385 1727204598.37737: done getting next task for host managed-node1 51385 1727204598.37739: ^ task is: TASK: meta 
(role_complete) 51385 1727204598.37742: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 51385 1727204598.37761: getting variables 51385 1727204598.37765: in VariableManager get_vars() 51385 1727204598.37809: Calling all_inventory to load vars for managed-node1 51385 1727204598.37812: Calling groups_inventory to load vars for managed-node1 51385 1727204598.37814: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204598.37824: Calling all_plugins_play to load vars for managed-node1 51385 1727204598.37826: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204598.37829: Calling groups_plugins_play to load vars for managed-node1 51385 1727204598.38718: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204598.39680: done with get_vars() 51385 1727204598.39707: done getting variables 51385 1727204598.39775: done queuing things up, now waiting for results queue to drain 51385 1727204598.39777: results queue empty 51385 1727204598.39777: checking for any_errors_fatal 51385 1727204598.39779: done checking for any_errors_fatal 51385 1727204598.39780: checking for max_fail_percentage 51385 1727204598.39780: done checking for max_fail_percentage 51385 1727204598.39781: checking to see if all hosts have failed and the running result is not ok 51385 1727204598.39781: done checking to see if all hosts have failed 51385 1727204598.39782: getting the 
remaining hosts for this loop 51385 1727204598.39783: done getting the remaining hosts for this loop 51385 1727204598.39785: getting the next task for host managed-node1 51385 1727204598.39788: done getting next task for host managed-node1 51385 1727204598.39789: ^ task is: TASK: Include the task 'assert_device_present.yml' 51385 1727204598.39790: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 51385 1727204598.39793: getting variables 51385 1727204598.39794: in VariableManager get_vars() 51385 1727204598.39806: Calling all_inventory to load vars for managed-node1 51385 1727204598.39808: Calling groups_inventory to load vars for managed-node1 51385 1727204598.39809: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204598.39813: Calling all_plugins_play to load vars for managed-node1 51385 1727204598.39815: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204598.39817: Calling groups_plugins_play to load vars for managed-node1 51385 1727204598.40601: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204598.41547: done with get_vars() 51385 1727204598.41570: done getting variables TASK [Include the task 'assert_device_present.yml'] **************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_vlan_mtu.yml:46 Tuesday 24 September 2024 15:03:18 -0400 (0:00:00.540) 0:00:16.820 ***** 51385 1727204598.41624: entering _queue_task() for managed-node1/include_tasks 51385 1727204598.41917: worker is 1 (out of 1 available) 51385 1727204598.41930: exiting _queue_task() for managed-node1/include_tasks 51385 1727204598.41943: done queuing things up, 
now waiting for results queue to drain 51385 1727204598.41944: waiting for pending results... 51385 1727204598.42133: running TaskExecutor() for managed-node1/TASK: Include the task 'assert_device_present.yml' 51385 1727204598.42199: in run() - task 0affcd87-79f5-6b1f-5706-00000000005b 51385 1727204598.42209: variable 'ansible_search_path' from source: unknown 51385 1727204598.42239: calling self._execute() 51385 1727204598.42313: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204598.42323: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204598.42330: variable 'omit' from source: magic vars 51385 1727204598.42630: variable 'ansible_distribution_major_version' from source: facts 51385 1727204598.42642: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204598.42647: _execute() done 51385 1727204598.42651: dumping result to json 51385 1727204598.42655: done dumping result, returning 51385 1727204598.42660: done running TaskExecutor() for managed-node1/TASK: Include the task 'assert_device_present.yml' [0affcd87-79f5-6b1f-5706-00000000005b] 51385 1727204598.42671: sending task result for task 0affcd87-79f5-6b1f-5706-00000000005b 51385 1727204598.42761: done sending task result for task 0affcd87-79f5-6b1f-5706-00000000005b 51385 1727204598.42772: WORKER PROCESS EXITING 51385 1727204598.42799: no more pending results, returning what we have 51385 1727204598.42804: in VariableManager get_vars() 51385 1727204598.42849: Calling all_inventory to load vars for managed-node1 51385 1727204598.42852: Calling groups_inventory to load vars for managed-node1 51385 1727204598.42854: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204598.42870: Calling all_plugins_play to load vars for managed-node1 51385 1727204598.42873: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204598.42885: Calling groups_plugins_play to load vars for 
managed-node1 51385 1727204598.43743: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204598.44771: done with get_vars() 51385 1727204598.44785: variable 'ansible_search_path' from source: unknown 51385 1727204598.44797: we have included files to process 51385 1727204598.44797: generating all_blocks data 51385 1727204598.44799: done generating all_blocks data 51385 1727204598.44802: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 51385 1727204598.44803: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 51385 1727204598.44805: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 51385 1727204598.44884: in VariableManager get_vars() 51385 1727204598.44899: done with get_vars() 51385 1727204598.44984: done processing included file 51385 1727204598.44985: iterating over new_blocks loaded from include file 51385 1727204598.44987: in VariableManager get_vars() 51385 1727204598.44999: done with get_vars() 51385 1727204598.45000: filtering new block on tags 51385 1727204598.45012: done filtering new block on tags 51385 1727204598.45014: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed-node1 51385 1727204598.45018: extending task lists for all hosts with included blocks 51385 1727204598.47298: done extending task lists 51385 1727204598.47300: done processing included files 51385 1727204598.47300: results queue empty 51385 1727204598.47301: checking for any_errors_fatal 51385 1727204598.47302: done checking for any_errors_fatal 51385 1727204598.47302: checking for 
max_fail_percentage 51385 1727204598.47303: done checking for max_fail_percentage 51385 1727204598.47304: checking to see if all hosts have failed and the running result is not ok 51385 1727204598.47304: done checking to see if all hosts have failed 51385 1727204598.47305: getting the remaining hosts for this loop 51385 1727204598.47306: done getting the remaining hosts for this loop 51385 1727204598.47307: getting the next task for host managed-node1 51385 1727204598.47310: done getting next task for host managed-node1 51385 1727204598.47312: ^ task is: TASK: Include the task 'get_interface_stat.yml' 51385 1727204598.47314: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51385 1727204598.47317: getting variables 51385 1727204598.47318: in VariableManager get_vars() 51385 1727204598.47335: Calling all_inventory to load vars for managed-node1 51385 1727204598.47337: Calling groups_inventory to load vars for managed-node1 51385 1727204598.47338: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204598.47343: Calling all_plugins_play to load vars for managed-node1 51385 1727204598.47344: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204598.47346: Calling groups_plugins_play to load vars for managed-node1 51385 1727204598.48034: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204598.48957: done with get_vars() 51385 1727204598.48975: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Tuesday 24 September 2024 15:03:18 -0400 (0:00:00.074) 0:00:16.894 ***** 51385 1727204598.49029: entering _queue_task() for managed-node1/include_tasks 51385 1727204598.49277: worker is 1 (out of 1 available) 51385 1727204598.49291: exiting _queue_task() for managed-node1/include_tasks 51385 1727204598.49302: done queuing things up, now waiting for results queue to drain 51385 1727204598.49304: waiting for pending results... 
51385 1727204598.49557: running TaskExecutor() for managed-node1/TASK: Include the task 'get_interface_stat.yml' 51385 1727204598.49667: in run() - task 0affcd87-79f5-6b1f-5706-000000000578 51385 1727204598.49675: variable 'ansible_search_path' from source: unknown 51385 1727204598.49683: variable 'ansible_search_path' from source: unknown 51385 1727204598.49736: calling self._execute() 51385 1727204598.49827: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204598.49831: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204598.50078: variable 'omit' from source: magic vars 51385 1727204598.50247: variable 'ansible_distribution_major_version' from source: facts 51385 1727204598.50277: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204598.50293: _execute() done 51385 1727204598.50302: dumping result to json 51385 1727204598.50310: done dumping result, returning 51385 1727204598.50319: done running TaskExecutor() for managed-node1/TASK: Include the task 'get_interface_stat.yml' [0affcd87-79f5-6b1f-5706-000000000578] 51385 1727204598.50330: sending task result for task 0affcd87-79f5-6b1f-5706-000000000578 51385 1727204598.50445: done sending task result for task 0affcd87-79f5-6b1f-5706-000000000578 51385 1727204598.50453: WORKER PROCESS EXITING 51385 1727204598.50500: no more pending results, returning what we have 51385 1727204598.50507: in VariableManager get_vars() 51385 1727204598.50552: Calling all_inventory to load vars for managed-node1 51385 1727204598.50556: Calling groups_inventory to load vars for managed-node1 51385 1727204598.50559: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204598.50575: Calling all_plugins_play to load vars for managed-node1 51385 1727204598.50578: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204598.50580: Calling groups_plugins_play to load vars for managed-node1 51385 
1727204598.51804: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204598.52739: done with get_vars() 51385 1727204598.52755: variable 'ansible_search_path' from source: unknown 51385 1727204598.52757: variable 'ansible_search_path' from source: unknown 51385 1727204598.52787: we have included files to process 51385 1727204598.52788: generating all_blocks data 51385 1727204598.52789: done generating all_blocks data 51385 1727204598.52790: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 51385 1727204598.52790: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 51385 1727204598.52792: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 51385 1727204598.52930: done processing included file 51385 1727204598.52932: iterating over new_blocks loaded from include file 51385 1727204598.52933: in VariableManager get_vars() 51385 1727204598.52947: done with get_vars() 51385 1727204598.52948: filtering new block on tags 51385 1727204598.52961: done filtering new block on tags 51385 1727204598.52963: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node1 51385 1727204598.52969: extending task lists for all hosts with included blocks 51385 1727204598.53031: done extending task lists 51385 1727204598.53033: done processing included files 51385 1727204598.53033: results queue empty 51385 1727204598.53034: checking for any_errors_fatal 51385 1727204598.53037: done checking for any_errors_fatal 51385 1727204598.53038: checking for max_fail_percentage 51385 1727204598.53039: done checking for 
max_fail_percentage 51385 1727204598.53039: checking to see if all hosts have failed and the running result is not ok 51385 1727204598.53040: done checking to see if all hosts have failed 51385 1727204598.53040: getting the remaining hosts for this loop 51385 1727204598.53041: done getting the remaining hosts for this loop 51385 1727204598.53043: getting the next task for host managed-node1 51385 1727204598.53046: done getting next task for host managed-node1 51385 1727204598.53048: ^ task is: TASK: Get stat for interface {{ interface }} 51385 1727204598.53050: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51385 1727204598.53052: getting variables 51385 1727204598.53052: in VariableManager get_vars() 51385 1727204598.53066: Calling all_inventory to load vars for managed-node1 51385 1727204598.53068: Calling groups_inventory to load vars for managed-node1 51385 1727204598.53069: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204598.53073: Calling all_plugins_play to load vars for managed-node1 51385 1727204598.53075: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204598.53078: Calling groups_plugins_play to load vars for managed-node1 51385 1727204598.53821: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204598.55495: done with get_vars() 51385 1727204598.55524: done getting variables 51385 1727204598.55697: variable 'interface' from source: include params 51385 1727204598.55702: variable 'vlan_interface' from source: play vars 51385 1727204598.55767: variable 'vlan_interface' from source: play vars TASK [Get stat for interface lsr101.90] **************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 15:03:18 -0400 (0:00:00.067) 0:00:16.961 ***** 51385 1727204598.55798: entering _queue_task() for managed-node1/stat 51385 1727204598.56136: worker is 1 (out of 1 available) 51385 1727204598.56150: exiting _queue_task() for managed-node1/stat 51385 1727204598.56165: done queuing things up, now waiting for results queue to drain 51385 1727204598.56167: waiting for pending results... 
51385 1727204598.56457: running TaskExecutor() for managed-node1/TASK: Get stat for interface lsr101.90 51385 1727204598.56592: in run() - task 0affcd87-79f5-6b1f-5706-00000000069c 51385 1727204598.56617: variable 'ansible_search_path' from source: unknown 51385 1727204598.56625: variable 'ansible_search_path' from source: unknown 51385 1727204598.56671: calling self._execute() 51385 1727204598.56770: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204598.56781: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204598.56796: variable 'omit' from source: magic vars 51385 1727204598.57181: variable 'ansible_distribution_major_version' from source: facts 51385 1727204598.57199: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204598.57211: variable 'omit' from source: magic vars 51385 1727204598.57275: variable 'omit' from source: magic vars 51385 1727204598.57387: variable 'interface' from source: include params 51385 1727204598.57396: variable 'vlan_interface' from source: play vars 51385 1727204598.57462: variable 'vlan_interface' from source: play vars 51385 1727204598.57491: variable 'omit' from source: magic vars 51385 1727204598.57540: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 51385 1727204598.57587: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 51385 1727204598.57616: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 51385 1727204598.57639: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 51385 1727204598.57654: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 51385 1727204598.57692: variable 'inventory_hostname' from source: host vars for 
'managed-node1' 51385 1727204598.57704: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204598.57711: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204598.57825: Set connection var ansible_pipelining to False 51385 1727204598.57833: Set connection var ansible_shell_type to sh 51385 1727204598.57850: Set connection var ansible_module_compression to ZIP_DEFLATED 51385 1727204598.57868: Set connection var ansible_timeout to 10 51385 1727204598.57875: Set connection var ansible_connection to ssh 51385 1727204598.57885: Set connection var ansible_shell_executable to /bin/sh 51385 1727204598.57914: variable 'ansible_shell_executable' from source: unknown 51385 1727204598.57925: variable 'ansible_connection' from source: unknown 51385 1727204598.57932: variable 'ansible_module_compression' from source: unknown 51385 1727204598.57938: variable 'ansible_shell_type' from source: unknown 51385 1727204598.57943: variable 'ansible_shell_executable' from source: unknown 51385 1727204598.57949: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204598.57955: variable 'ansible_pipelining' from source: unknown 51385 1727204598.57967: variable 'ansible_timeout' from source: unknown 51385 1727204598.57975: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204598.58184: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 51385 1727204598.58200: variable 'omit' from source: magic vars 51385 1727204598.58210: starting attempt loop 51385 1727204598.58216: running the handler 51385 1727204598.58234: _low_level_execute_command(): starting 51385 1727204598.58250: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 51385 
1727204598.59012: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204598.59031: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204598.59048: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204598.59074: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204598.59122: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204598.59135: stderr chunk (state=3): >>>debug2: match not found <<< 51385 1727204598.59149: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204598.59172: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 51385 1727204598.59183: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 51385 1727204598.59194: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 51385 1727204598.59205: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204598.59219: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204598.59240: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204598.59254: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204598.59271: stderr chunk (state=3): >>>debug2: match found <<< 51385 1727204598.59285: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204598.59363: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204598.59383: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204598.59397: stderr 
chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204598.59491: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204598.61035: stdout chunk (state=3): >>>/root <<< 51385 1727204598.61132: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204598.61224: stderr chunk (state=3): >>><<< 51385 1727204598.61228: stdout chunk (state=3): >>><<< 51385 1727204598.61347: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204598.61351: _low_level_execute_command(): starting 51385 1727204598.61354: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204598.6125169-52581-200745607776997 `" && echo 
ansible-tmp-1727204598.6125169-52581-200745607776997="` echo /root/.ansible/tmp/ansible-tmp-1727204598.6125169-52581-200745607776997 `" ) && sleep 0' 51385 1727204598.61924: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204598.61937: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204598.61950: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204598.61969: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204598.62014: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204598.62027: stderr chunk (state=3): >>>debug2: match not found <<< 51385 1727204598.62044: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204598.62063: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 51385 1727204598.62081: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 51385 1727204598.62093: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 51385 1727204598.62107: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204598.62122: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204598.62138: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204598.62151: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204598.62166: stderr chunk (state=3): >>>debug2: match found <<< 51385 1727204598.62181: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204598.62252: stderr chunk (state=3): >>>debug1: 
auto-mux: Trying existing master <<< 51385 1727204598.62272: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204598.62287: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204598.62386: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204598.64228: stdout chunk (state=3): >>>ansible-tmp-1727204598.6125169-52581-200745607776997=/root/.ansible/tmp/ansible-tmp-1727204598.6125169-52581-200745607776997 <<< 51385 1727204598.64331: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204598.64418: stderr chunk (state=3): >>><<< 51385 1727204598.64421: stdout chunk (state=3): >>><<< 51385 1727204598.64511: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204598.6125169-52581-200745607776997=/root/.ansible/tmp/ansible-tmp-1727204598.6125169-52581-200745607776997 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204598.64515: variable 'ansible_module_compression' from source: unknown 51385 1727204598.64589: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-51385tpxlmlox/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 51385 1727204598.64642: variable 'ansible_facts' from source: unknown 51385 1727204598.64715: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204598.6125169-52581-200745607776997/AnsiballZ_stat.py 51385 1727204598.64880: Sending initial data 51385 1727204598.64883: Sent initial data (153 bytes) 51385 1727204598.65863: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204598.65881: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204598.65897: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204598.65916: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204598.65959: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204598.65977: stderr chunk (state=3): >>>debug2: match not found <<< 51385 1727204598.65992: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204598.66010: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 51385 1727204598.66022: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 51385 1727204598.66034: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 51385 1727204598.66049: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204598.66063: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204598.66092: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204598.66106: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204598.66119: stderr chunk (state=3): >>>debug2: match found <<< 51385 1727204598.66134: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204598.66215: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204598.66238: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204598.66255: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204598.66345: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204598.68091: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 51385 1727204598.68134: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 51385 1727204598.68190: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-51385tpxlmlox/tmprkmk6xan /root/.ansible/tmp/ansible-tmp-1727204598.6125169-52581-200745607776997/AnsiballZ_stat.py <<< 51385 1727204598.68240: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 51385 1727204598.69334: 
stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204598.69446: stderr chunk (state=3): >>><<< 51385 1727204598.69450: stdout chunk (state=3): >>><<< 51385 1727204598.69471: done transferring module to remote 51385 1727204598.69482: _low_level_execute_command(): starting 51385 1727204598.69488: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204598.6125169-52581-200745607776997/ /root/.ansible/tmp/ansible-tmp-1727204598.6125169-52581-200745607776997/AnsiballZ_stat.py && sleep 0' 51385 1727204598.70210: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204598.70214: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204598.70249: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204598.70253: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204598.70306: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204598.70309: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204598.70373: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 51385 1727204598.72091: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204598.72158: stderr chunk (state=3): >>><<< 51385 1727204598.72161: stdout chunk (state=3): >>><<< 51385 1727204598.72261: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204598.72267: _low_level_execute_command(): starting 51385 1727204598.72269: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204598.6125169-52581-200745607776997/AnsiballZ_stat.py && sleep 0' 51385 1727204598.73329: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204598.73335: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204598.73380: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 51385 1727204598.73383: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204598.73386: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found <<< 51385 1727204598.73389: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204598.73451: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204598.73454: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204598.73459: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204598.73527: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204598.86546: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/lsr101.90", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 31143, "dev": 21, "nlink": 1, "atime": 1727204597.6267304, "mtime": 1727204597.6267304, "ctime": 1727204597.6267304, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, 
"device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/lsr101.90", "lnk_target": "../../devices/virtual/net/lsr101.90", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/lsr101.90", "follow": false, "checksum_algorithm": "sha1"}}} <<< 51385 1727204598.87540: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. <<< 51385 1727204598.87544: stdout chunk (state=3): >>><<< 51385 1727204598.87546: stderr chunk (state=3): >>><<< 51385 1727204598.87671: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/lsr101.90", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 31143, "dev": 21, "nlink": 1, "atime": 1727204597.6267304, "mtime": 1727204597.6267304, "ctime": 1727204597.6267304, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/lsr101.90", "lnk_target": "../../devices/virtual/net/lsr101.90", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/lsr101.90", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 51385 1727204598.87679: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/lsr101.90', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204598.6125169-52581-200745607776997/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 51385 1727204598.87681: _low_level_execute_command(): starting 51385 1727204598.87683: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204598.6125169-52581-200745607776997/ > /dev/null 2>&1 && sleep 0' 51385 1727204598.88341: stderr chunk 
(state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204598.88362: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204598.88383: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204598.88423: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204598.88481: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204598.88494: stderr chunk (state=3): >>>debug2: match not found <<< 51385 1727204598.88509: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204598.88527: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 51385 1727204598.88539: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 51385 1727204598.88551: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 51385 1727204598.88569: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204598.88588: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204598.88604: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204598.88616: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204598.88628: stderr chunk (state=3): >>>debug2: match found <<< 51385 1727204598.88646: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204598.88728: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204598.88752: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204598.88772: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 51385 1727204598.88871: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204598.90703: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204598.90707: stdout chunk (state=3): >>><<< 51385 1727204598.90709: stderr chunk (state=3): >>><<< 51385 1727204598.90769: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204598.90773: handler run complete 51385 1727204598.91082: attempt loop complete, returning result 51385 1727204598.91085: _execute() done 51385 1727204598.91088: dumping result to json 51385 1727204598.91090: done dumping result, returning 51385 1727204598.91092: done running TaskExecutor() for managed-node1/TASK: Get stat for interface lsr101.90 [0affcd87-79f5-6b1f-5706-00000000069c] 51385 
1727204598.91094: sending task result for task 0affcd87-79f5-6b1f-5706-00000000069c 51385 1727204598.91178: done sending task result for task 0affcd87-79f5-6b1f-5706-00000000069c 51385 1727204598.91181: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "stat": { "atime": 1727204597.6267304, "block_size": 4096, "blocks": 0, "ctime": 1727204597.6267304, "dev": 21, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 31143, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/lsr101.90", "lnk_target": "../../devices/virtual/net/lsr101.90", "mode": "0777", "mtime": 1727204597.6267304, "nlink": 1, "path": "/sys/class/net/lsr101.90", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 51385 1727204598.91279: no more pending results, returning what we have 51385 1727204598.91283: results queue empty 51385 1727204598.91284: checking for any_errors_fatal 51385 1727204598.91286: done checking for any_errors_fatal 51385 1727204598.91287: checking for max_fail_percentage 51385 1727204598.91289: done checking for max_fail_percentage 51385 1727204598.91290: checking to see if all hosts have failed and the running result is not ok 51385 1727204598.91291: done checking to see if all hosts have failed 51385 1727204598.91291: getting the remaining hosts for this loop 51385 1727204598.91293: done getting the remaining hosts for this loop 51385 1727204598.91297: getting the next task for host managed-node1 51385 1727204598.91305: done getting next task for host managed-node1 51385 1727204598.91308: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 51385 1727204598.91311: ^ state is: HOST STATE: block=2, task=11, rescue=0, 
always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 51385 1727204598.91315: getting variables 51385 1727204598.91317: in VariableManager get_vars() 51385 1727204598.91363: Calling all_inventory to load vars for managed-node1 51385 1727204598.91372: Calling groups_inventory to load vars for managed-node1 51385 1727204598.91375: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204598.91386: Calling all_plugins_play to load vars for managed-node1 51385 1727204598.91389: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204598.91392: Calling groups_plugins_play to load vars for managed-node1 51385 1727204598.97780: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204599.02998: done with get_vars() 51385 1727204599.03036: done getting variables 51385 1727204599.03103: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 51385 1727204599.03247: variable 'interface' from source: include params 51385 1727204599.03251: variable 'vlan_interface' from source: play vars 51385 1727204599.03515: variable 'vlan_interface' from source: play vars TASK [Assert that the interface is present - 'lsr101.90'] ********************** task path: 
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Tuesday 24 September 2024 15:03:19 -0400 (0:00:00.477) 0:00:17.439 ***** 51385 1727204599.03543: entering _queue_task() for managed-node1/assert 51385 1727204599.04277: worker is 1 (out of 1 available) 51385 1727204599.04290: exiting _queue_task() for managed-node1/assert 51385 1727204599.04301: done queuing things up, now waiting for results queue to drain 51385 1727204599.04302: waiting for pending results... 51385 1727204599.05030: running TaskExecutor() for managed-node1/TASK: Assert that the interface is present - 'lsr101.90' 51385 1727204599.05154: in run() - task 0affcd87-79f5-6b1f-5706-000000000579 51385 1727204599.05174: variable 'ansible_search_path' from source: unknown 51385 1727204599.05182: variable 'ansible_search_path' from source: unknown 51385 1727204599.05226: calling self._execute() 51385 1727204599.05331: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204599.05341: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204599.05352: variable 'omit' from source: magic vars 51385 1727204599.05713: variable 'ansible_distribution_major_version' from source: facts 51385 1727204599.05729: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204599.05738: variable 'omit' from source: magic vars 51385 1727204599.05783: variable 'omit' from source: magic vars 51385 1727204599.05885: variable 'interface' from source: include params 51385 1727204599.05893: variable 'vlan_interface' from source: play vars 51385 1727204599.05960: variable 'vlan_interface' from source: play vars 51385 1727204599.05986: variable 'omit' from source: magic vars 51385 1727204599.06030: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 51385 1727204599.06070: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 51385 1727204599.06100: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 51385 1727204599.06127: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 51385 1727204599.06143: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 51385 1727204599.06175: variable 'inventory_hostname' from source: host vars for 'managed-node1' 51385 1727204599.06186: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204599.06199: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204599.06307: Set connection var ansible_pipelining to False 51385 1727204599.06316: Set connection var ansible_shell_type to sh 51385 1727204599.06330: Set connection var ansible_module_compression to ZIP_DEFLATED 51385 1727204599.06346: Set connection var ansible_timeout to 10 51385 1727204599.06352: Set connection var ansible_connection to ssh 51385 1727204599.06361: Set connection var ansible_shell_executable to /bin/sh 51385 1727204599.06388: variable 'ansible_shell_executable' from source: unknown 51385 1727204599.06394: variable 'ansible_connection' from source: unknown 51385 1727204599.06400: variable 'ansible_module_compression' from source: unknown 51385 1727204599.06410: variable 'ansible_shell_type' from source: unknown 51385 1727204599.06416: variable 'ansible_shell_executable' from source: unknown 51385 1727204599.06422: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204599.06428: variable 'ansible_pipelining' from source: unknown 51385 1727204599.06434: variable 'ansible_timeout' from source: unknown 51385 1727204599.06446: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 
1727204599.06588: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 51385 1727204599.06609: variable 'omit' from source: magic vars 51385 1727204599.07415: starting attempt loop 51385 1727204599.07423: running the handler 51385 1727204599.07621: variable 'interface_stat' from source: set_fact 51385 1727204599.07761: Evaluated conditional (interface_stat.stat.exists): True 51385 1727204599.07774: handler run complete 51385 1727204599.07866: attempt loop complete, returning result 51385 1727204599.07875: _execute() done 51385 1727204599.07882: dumping result to json 51385 1727204599.07889: done dumping result, returning 51385 1727204599.07900: done running TaskExecutor() for managed-node1/TASK: Assert that the interface is present - 'lsr101.90' [0affcd87-79f5-6b1f-5706-000000000579] 51385 1727204599.07910: sending task result for task 0affcd87-79f5-6b1f-5706-000000000579 ok: [managed-node1] => { "changed": false } MSG: All assertions passed 51385 1727204599.08073: no more pending results, returning what we have 51385 1727204599.08077: results queue empty 51385 1727204599.08079: checking for any_errors_fatal 51385 1727204599.08091: done checking for any_errors_fatal 51385 1727204599.08092: checking for max_fail_percentage 51385 1727204599.08094: done checking for max_fail_percentage 51385 1727204599.08095: checking to see if all hosts have failed and the running result is not ok 51385 1727204599.08096: done checking to see if all hosts have failed 51385 1727204599.08097: getting the remaining hosts for this loop 51385 1727204599.08099: done getting the remaining hosts for this loop 51385 1727204599.08103: getting the next task for host managed-node1 51385 1727204599.08112: done getting next task for host managed-node1 
51385 1727204599.08115: ^ task is: TASK: Include the task 'assert_profile_present.yml' 51385 1727204599.08117: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 51385 1727204599.08121: getting variables 51385 1727204599.08123: in VariableManager get_vars() 51385 1727204599.08172: Calling all_inventory to load vars for managed-node1 51385 1727204599.08175: Calling groups_inventory to load vars for managed-node1 51385 1727204599.08178: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204599.08189: Calling all_plugins_play to load vars for managed-node1 51385 1727204599.08191: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204599.08194: Calling groups_plugins_play to load vars for managed-node1 51385 1727204599.09270: done sending task result for task 0affcd87-79f5-6b1f-5706-000000000579 51385 1727204599.09276: WORKER PROCESS EXITING 51385 1727204599.12254: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204599.15758: done with get_vars() 51385 1727204599.15788: done getting variables TASK [Include the task 'assert_profile_present.yml'] *************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_vlan_mtu.yml:50 Tuesday 24 September 2024 15:03:19 -0400 (0:00:00.123) 0:00:17.563 ***** 51385 1727204599.15910: entering _queue_task() for managed-node1/include_tasks 51385 1727204599.16789: worker is 1 (out of 1 available) 51385 1727204599.16824: exiting _queue_task() for managed-node1/include_tasks 51385 1727204599.16837: done queuing things up, now waiting for results queue to drain 51385 1727204599.16839: waiting for pending results... 
51385 1727204599.17579: running TaskExecutor() for managed-node1/TASK: Include the task 'assert_profile_present.yml'
51385 1727204599.17880: in run() - task 0affcd87-79f5-6b1f-5706-00000000005c
51385 1727204599.17971: variable 'ansible_search_path' from source: unknown
51385 1727204599.18027: variable 'interface' from source: play vars
51385 1727204599.18484: variable 'interface' from source: play vars
51385 1727204599.18599: variable 'vlan_interface' from source: play vars
51385 1727204599.18700: variable 'vlan_interface' from source: play vars
51385 1727204599.18955: variable 'omit' from source: magic vars
51385 1727204599.19315: variable 'ansible_host' from source: host vars for 'managed-node1'
51385 1727204599.19341: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
51385 1727204599.19360: variable 'omit' from source: magic vars
51385 1727204599.19636: variable 'ansible_distribution_major_version' from source: facts
51385 1727204599.19656: Evaluated conditional (ansible_distribution_major_version != '6'): True
51385 1727204599.19699: variable 'item' from source: unknown
51385 1727204599.19772: variable 'item' from source: unknown
51385 1727204599.20019: variable 'ansible_host' from source: host vars for 'managed-node1'
51385 1727204599.20156: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
51385 1727204599.20174: variable 'omit' from source: magic vars
51385 1727204599.20690: variable 'ansible_distribution_major_version' from source: facts
51385 1727204599.20701: Evaluated conditional (ansible_distribution_major_version != '6'): True
51385 1727204599.20731: variable 'item' from source: unknown
51385 1727204599.20810: variable 'item' from source: unknown
51385 1727204599.20909: dumping result to json
51385 1727204599.20920: done dumping result, returning
51385 1727204599.20931: done running TaskExecutor() for managed-node1/TASK: Include the task 'assert_profile_present.yml' [0affcd87-79f5-6b1f-5706-00000000005c]
51385 1727204599.20942: sending task result for task 0affcd87-79f5-6b1f-5706-00000000005c
51385 1727204599.21045: no more pending results, returning what we have
51385 1727204599.21052: in VariableManager get_vars()
51385 1727204599.21120: Calling all_inventory to load vars for managed-node1
51385 1727204599.21125: Calling groups_inventory to load vars for managed-node1
51385 1727204599.21128: Calling all_plugins_inventory to load vars for managed-node1
51385 1727204599.21142: Calling all_plugins_play to load vars for managed-node1
51385 1727204599.21145: Calling groups_plugins_inventory to load vars for managed-node1
51385 1727204599.21148: Calling groups_plugins_play to load vars for managed-node1
51385 1727204599.22258: done sending task result for task 0affcd87-79f5-6b1f-5706-00000000005c
51385 1727204599.22262: WORKER PROCESS EXITING
51385 1727204599.23916: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
51385 1727204599.25981: done with get_vars()
51385 1727204599.26008: variable 'ansible_search_path' from source: unknown
51385 1727204599.26026: variable 'ansible_search_path' from source: unknown
51385 1727204599.26034: we have included files to process
51385 1727204599.26035: generating all_blocks data
51385 1727204599.26038: done generating all_blocks data
51385 1727204599.26043: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml
51385 1727204599.26044: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml
51385 1727204599.26046: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml
51385 1727204599.26258: in VariableManager get_vars()
51385 1727204599.26291: done with get_vars()
51385 1727204599.26559: done processing included file
51385 1727204599.26562: iterating over new_blocks loaded from include file
51385 1727204599.26565: in VariableManager get_vars()
51385 1727204599.26584: done with get_vars()
51385 1727204599.26586: filtering new block on tags
51385 1727204599.26613: done filtering new block on tags
51385 1727204599.26615: done iterating over new_blocks loaded from include file
included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed-node1 => (item=lsr101)
51385 1727204599.26621: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml
51385 1727204599.26622: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml
51385 1727204599.26625: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml
51385 1727204599.26725: in VariableManager get_vars()
51385 1727204599.26743: done with get_vars()
51385 1727204599.26949: done processing included file
51385 1727204599.26951: iterating over new_blocks loaded from include file
51385 1727204599.26952: in VariableManager get_vars()
51385 1727204599.26970: done with get_vars()
51385 1727204599.26972: filtering new block on tags
51385 1727204599.26990: done filtering new block on tags
51385 1727204599.26992: done iterating over new_blocks loaded from include file
included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed-node1 => (item=lsr101.90)
51385 1727204599.26996: extending task lists for all hosts with included blocks
51385 1727204599.29824: done extending task lists
51385 1727204599.29825: done processing included files
51385 1727204599.29826: results queue empty
51385 1727204599.29827: checking for any_errors_fatal
51385 1727204599.29830: done checking for any_errors_fatal
51385 1727204599.29831: checking for max_fail_percentage
51385 1727204599.29832: done checking for max_fail_percentage
51385 1727204599.29833: checking to see if all hosts have failed and the running result is not ok
51385 1727204599.29834: done checking to see if all hosts have failed
51385 1727204599.29834: getting the remaining hosts for this loop
51385 1727204599.29836: done getting the remaining hosts for this loop
51385 1727204599.29838: getting the next task for host managed-node1
51385 1727204599.29842: done getting next task for host managed-node1
51385 1727204599.29844: ^ task is: TASK: Include the task 'get_profile_stat.yml'
51385 1727204599.29847: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
51385 1727204599.29849: getting variables
51385 1727204599.29850: in VariableManager get_vars()
51385 1727204599.29871: Calling all_inventory to load vars for managed-node1
51385 1727204599.29875: Calling groups_inventory to load vars for managed-node1
51385 1727204599.29878: Calling all_plugins_inventory to load vars for managed-node1
51385 1727204599.29884: Calling all_plugins_play to load vars for managed-node1
51385 1727204599.29886: Calling groups_plugins_inventory to load vars for managed-node1
51385 1727204599.29895: Calling groups_plugins_play to load vars for managed-node1
51385 1727204599.31308: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
51385 1727204599.33184: done with get_vars()
51385 1727204599.33214: done getting variables

TASK [Include the task 'get_profile_stat.yml'] *********************************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3
Tuesday 24 September 2024 15:03:19 -0400 (0:00:00.173) 0:00:17.737 *****
51385 1727204599.33310: entering _queue_task() for managed-node1/include_tasks
51385 1727204599.33727: worker is 1 (out of 1 available)
51385 1727204599.33743: exiting _queue_task() for managed-node1/include_tasks
51385 1727204599.33757: done queuing things up, now waiting for results queue to drain
51385 1727204599.33758: waiting for pending results...
51385 1727204599.34099: running TaskExecutor() for managed-node1/TASK: Include the task 'get_profile_stat.yml'
51385 1727204599.34229: in run() - task 0affcd87-79f5-6b1f-5706-0000000006b8
51385 1727204599.34251: variable 'ansible_search_path' from source: unknown
51385 1727204599.34335: variable 'ansible_search_path' from source: unknown
51385 1727204599.34384: calling self._execute()
51385 1727204599.34606: variable 'ansible_host' from source: host vars for 'managed-node1'
51385 1727204599.34619: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
51385 1727204599.34692: variable 'omit' from source: magic vars
51385 1727204599.35098: variable 'ansible_distribution_major_version' from source: facts
51385 1727204599.35124: Evaluated conditional (ansible_distribution_major_version != '6'): True
51385 1727204599.35134: _execute() done
51385 1727204599.35142: dumping result to json
51385 1727204599.35150: done dumping result, returning
51385 1727204599.35159: done running TaskExecutor() for managed-node1/TASK: Include the task 'get_profile_stat.yml' [0affcd87-79f5-6b1f-5706-0000000006b8]
51385 1727204599.35174: sending task result for task 0affcd87-79f5-6b1f-5706-0000000006b8
51385 1727204599.35314: no more pending results, returning what we have
51385 1727204599.35319: in VariableManager get_vars()
51385 1727204599.35375: Calling all_inventory to load vars for managed-node1
51385 1727204599.35379: Calling groups_inventory to load vars for managed-node1
51385 1727204599.35382: Calling all_plugins_inventory to load vars for managed-node1
51385 1727204599.35399: Calling all_plugins_play to load vars for managed-node1
51385 1727204599.35402: Calling groups_plugins_inventory to load vars for managed-node1
51385 1727204599.35406: Calling groups_plugins_play to load vars for managed-node1
51385 1727204599.36593: done sending task result for task 0affcd87-79f5-6b1f-5706-0000000006b8
51385 1727204599.36597: WORKER PROCESS EXITING
51385 1727204599.37848: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
51385 1727204599.41690: done with get_vars()
51385 1727204599.41717: variable 'ansible_search_path' from source: unknown
51385 1727204599.41719: variable 'ansible_search_path' from source: unknown
51385 1727204599.41759: we have included files to process
51385 1727204599.41760: generating all_blocks data
51385 1727204599.41762: done generating all_blocks data
51385 1727204599.41878: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml
51385 1727204599.41881: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml
51385 1727204599.41885: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml
51385 1727204599.44330: done processing included file
51385 1727204599.44333: iterating over new_blocks loaded from include file
51385 1727204599.44334: in VariableManager get_vars()
51385 1727204599.44357: done with get_vars()
51385 1727204599.44359: filtering new block on tags
51385 1727204599.44500: done filtering new block on tags
51385 1727204599.44503: in VariableManager get_vars()
51385 1727204599.44525: done with get_vars()
51385 1727204599.44526: filtering new block on tags
51385 1727204599.44549: done filtering new block on tags
51385 1727204599.44551: done iterating over new_blocks loaded from include file
included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed-node1
51385 1727204599.44555: extending task lists for all hosts with included blocks
51385 1727204599.44971: done extending task lists
51385 1727204599.44972: done processing included files
51385 1727204599.44973: results queue empty
51385 1727204599.44974: checking for any_errors_fatal
51385 1727204599.44978: done checking for any_errors_fatal
51385 1727204599.44978: checking for max_fail_percentage
51385 1727204599.44980: done checking for max_fail_percentage
51385 1727204599.44980: checking to see if all hosts have failed and the running result is not ok
51385 1727204599.44981: done checking to see if all hosts have failed
51385 1727204599.44982: getting the remaining hosts for this loop
51385 1727204599.44983: done getting the remaining hosts for this loop
51385 1727204599.44985: getting the next task for host managed-node1
51385 1727204599.44989: done getting next task for host managed-node1
51385 1727204599.44992: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag
51385 1727204599.44995: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
51385 1727204599.44997: getting variables
51385 1727204599.44998: in VariableManager get_vars()
51385 1727204599.45196: Calling all_inventory to load vars for managed-node1
51385 1727204599.45199: Calling groups_inventory to load vars for managed-node1
51385 1727204599.45201: Calling all_plugins_inventory to load vars for managed-node1
51385 1727204599.45207: Calling all_plugins_play to load vars for managed-node1
51385 1727204599.45209: Calling groups_plugins_inventory to load vars for managed-node1
51385 1727204599.45212: Calling groups_plugins_play to load vars for managed-node1
51385 1727204599.47761: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
51385 1727204599.52660: done with get_vars()
51385 1727204599.52695: done getting variables
51385 1727204599.52857: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Initialize NM profile exist and ansible_managed comment flag] ************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3
Tuesday 24 September 2024 15:03:19 -0400 (0:00:00.195) 0:00:17.932 *****
51385 1727204599.52887: entering _queue_task() for managed-node1/set_fact
51385 1727204599.53628: worker is 1 (out of 1 available)
51385 1727204599.53643: exiting _queue_task() for managed-node1/set_fact
51385 1727204599.53658: done queuing things up, now waiting for results queue to drain
51385 1727204599.53660: waiting for pending results...
51385 1727204599.54566: running TaskExecutor() for managed-node1/TASK: Initialize NM profile exist and ansible_managed comment flag
51385 1727204599.54931: in run() - task 0affcd87-79f5-6b1f-5706-0000000007f0
51385 1727204599.55043: variable 'ansible_search_path' from source: unknown
51385 1727204599.55053: variable 'ansible_search_path' from source: unknown
51385 1727204599.55096: calling self._execute()
51385 1727204599.55314: variable 'ansible_host' from source: host vars for 'managed-node1'
51385 1727204599.55326: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
51385 1727204599.55344: variable 'omit' from source: magic vars
51385 1727204599.56091: variable 'ansible_distribution_major_version' from source: facts
51385 1727204599.56242: Evaluated conditional (ansible_distribution_major_version != '6'): True
51385 1727204599.56254: variable 'omit' from source: magic vars
51385 1727204599.56307: variable 'omit' from source: magic vars
51385 1727204599.56378: variable 'omit' from source: magic vars
51385 1727204599.56489: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
51385 1727204599.56595: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
51385 1727204599.56622: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
51385 1727204599.56684: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
51385 1727204599.56786: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
51385 1727204599.56822: variable 'inventory_hostname' from source: host vars for 'managed-node1'
51385 1727204599.56832: variable 'ansible_host' from source: host vars for 'managed-node1'
51385 1727204599.56842: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
51385 1727204599.57068: Set connection var ansible_pipelining to False
51385 1727204599.57077: Set connection var ansible_shell_type to sh
51385 1727204599.57095: Set connection var ansible_module_compression to ZIP_DEFLATED
51385 1727204599.57225: Set connection var ansible_timeout to 10
51385 1727204599.57233: Set connection var ansible_connection to ssh
51385 1727204599.57245: Set connection var ansible_shell_executable to /bin/sh
51385 1727204599.57275: variable 'ansible_shell_executable' from source: unknown
51385 1727204599.57284: variable 'ansible_connection' from source: unknown
51385 1727204599.57292: variable 'ansible_module_compression' from source: unknown
51385 1727204599.57300: variable 'ansible_shell_type' from source: unknown
51385 1727204599.57307: variable 'ansible_shell_executable' from source: unknown
51385 1727204599.57315: variable 'ansible_host' from source: host vars for 'managed-node1'
51385 1727204599.57328: variable 'ansible_pipelining' from source: unknown
51385 1727204599.57335: variable 'ansible_timeout' from source: unknown
51385 1727204599.57343: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
51385 1727204599.57716: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
51385 1727204599.57776: variable 'omit' from source: magic vars
51385 1727204599.57787: starting attempt loop
51385 1727204599.57794: running the handler
51385 1727204599.57811: handler run complete
51385 1727204599.57885: attempt loop complete, returning result
51385 1727204599.57892: _execute() done
51385 1727204599.57899: dumping result to json
51385 1727204599.57906: done dumping result, returning
51385 1727204599.57916: done running TaskExecutor() for managed-node1/TASK: Initialize NM profile exist and ansible_managed comment flag [0affcd87-79f5-6b1f-5706-0000000007f0]
51385 1727204599.57925: sending task result for task 0affcd87-79f5-6b1f-5706-0000000007f0
ok: [managed-node1] => {
    "ansible_facts": {
        "lsr_net_profile_ansible_managed": false,
        "lsr_net_profile_exists": false,
        "lsr_net_profile_fingerprint": false
    },
    "changed": false
}
51385 1727204599.58145: no more pending results, returning what we have
51385 1727204599.58149: results queue empty
51385 1727204599.58150: checking for any_errors_fatal
51385 1727204599.58152: done checking for any_errors_fatal
51385 1727204599.58153: checking for max_fail_percentage
51385 1727204599.58155: done checking for max_fail_percentage
51385 1727204599.58156: checking to see if all hosts have failed and the running result is not ok
51385 1727204599.58157: done checking to see if all hosts have failed
51385 1727204599.58158: getting the remaining hosts for this loop
51385 1727204599.58160: done getting the remaining hosts for this loop
51385 1727204599.58166: getting the next task for host managed-node1
51385 1727204599.58175: done getting next task for host managed-node1
51385 1727204599.58178: ^ task is: TASK: Stat profile file
51385 1727204599.58183: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
51385 1727204599.58191: getting variables
51385 1727204599.58193: in VariableManager get_vars()
51385 1727204599.58247: Calling all_inventory to load vars for managed-node1
51385 1727204599.58250: Calling groups_inventory to load vars for managed-node1
51385 1727204599.58253: Calling all_plugins_inventory to load vars for managed-node1
51385 1727204599.58272: Calling all_plugins_play to load vars for managed-node1
51385 1727204599.58276: Calling groups_plugins_inventory to load vars for managed-node1
51385 1727204599.58280: Calling groups_plugins_play to load vars for managed-node1
51385 1727204599.59473: done sending task result for task 0affcd87-79f5-6b1f-5706-0000000007f0
51385 1727204599.59478: WORKER PROCESS EXITING
51385 1727204599.61553: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
51385 1727204599.65235: done with get_vars()
51385 1727204599.65275: done getting variables

TASK [Stat profile file] *******************************************************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9
Tuesday 24 September 2024 15:03:19 -0400 (0:00:00.126) 0:00:18.058 *****
51385 1727204599.65507: entering _queue_task() for managed-node1/stat
51385 1727204599.66190: worker is 1 (out of 1 available)
51385 1727204599.66203: exiting _queue_task() for managed-node1/stat
51385 1727204599.66328: done queuing things up, now waiting for results queue to drain
51385 1727204599.66330: waiting for pending results...
51385 1727204599.67282: running TaskExecutor() for managed-node1/TASK: Stat profile file
51385 1727204599.67543: in run() - task 0affcd87-79f5-6b1f-5706-0000000007f1
51385 1727204599.67656: variable 'ansible_search_path' from source: unknown
51385 1727204599.67668: variable 'ansible_search_path' from source: unknown
51385 1727204599.67713: calling self._execute()
51385 1727204599.67827: variable 'ansible_host' from source: host vars for 'managed-node1'
51385 1727204599.68073: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
51385 1727204599.68093: variable 'omit' from source: magic vars
51385 1727204599.68792: variable 'ansible_distribution_major_version' from source: facts
51385 1727204599.68939: Evaluated conditional (ansible_distribution_major_version != '6'): True
51385 1727204599.68955: variable 'omit' from source: magic vars
51385 1727204599.69012: variable 'omit' from source: magic vars
51385 1727204599.69228: variable 'profile' from source: include params
51385 1727204599.69239: variable 'item' from source: include params
51385 1727204599.69330: variable 'item' from source: include params
51385 1727204599.69504: variable 'omit' from source: magic vars
51385 1727204599.69553: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
51385 1727204599.69622: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
51385 1727204599.69709: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
51385 1727204599.69731: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
51385 1727204599.69784: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
51385 1727204599.69935: variable 'inventory_hostname' from source: host vars for 'managed-node1'
51385 1727204599.69945: variable 'ansible_host' from source: host vars for 'managed-node1'
51385 1727204599.69953: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
51385 1727204599.70176: Set connection var ansible_pipelining to False
51385 1727204599.70185: Set connection var ansible_shell_type to sh
51385 1727204599.70200: Set connection var ansible_module_compression to ZIP_DEFLATED
51385 1727204599.70213: Set connection var ansible_timeout to 10
51385 1727204599.70220: Set connection var ansible_connection to ssh
51385 1727204599.70237: Set connection var ansible_shell_executable to /bin/sh
51385 1727204599.70270: variable 'ansible_shell_executable' from source: unknown
51385 1727204599.70349: variable 'ansible_connection' from source: unknown
51385 1727204599.70360: variable 'ansible_module_compression' from source: unknown
51385 1727204599.70370: variable 'ansible_shell_type' from source: unknown
51385 1727204599.70377: variable 'ansible_shell_executable' from source: unknown
51385 1727204599.70384: variable 'ansible_host' from source: host vars for 'managed-node1'
51385 1727204599.70392: variable 'ansible_pipelining' from source: unknown
51385 1727204599.70399: variable 'ansible_timeout' from source: unknown
51385 1727204599.70407: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
51385 1727204599.70843: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__)
51385 1727204599.71007: variable 'omit' from source: magic vars
51385 1727204599.71018: starting attempt loop
51385 1727204599.71026: running the handler
51385 1727204599.71046: _low_level_execute_command(): starting
51385 1727204599.71060: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
51385 1727204599.73064: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
51385 1727204599.73189: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
51385 1727204599.73207: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
51385 1727204599.73226: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
51385 1727204599.73276: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<<
51385 1727204599.73404: stderr chunk (state=3): >>>debug2: match not found <<<
51385 1727204599.73422: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
51385 1727204599.73442: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
51385 1727204599.73455: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<<
51385 1727204599.73470: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
51385 1727204599.73484: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
51385 1727204599.73503: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
51385 1727204599.73524: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
51385 1727204599.73539: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<<
51385 1727204599.73552: stderr chunk (state=3): >>>debug2: match found <<<
51385 1727204599.73573: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
51385 1727204599.73749: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
51385 1727204599.73777: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
51385 1727204599.73793: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
51385 1727204599.73892: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
51385 1727204599.75541: stdout chunk (state=3): >>>/root <<<
51385 1727204599.75744: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
51385 1727204599.75747: stdout chunk (state=3): >>><<<
51385 1727204599.75750: stderr chunk (state=3): >>><<<
51385 1727204599.75872: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
51385 1727204599.75875: _low_level_execute_command(): starting
51385 1727204599.75879: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204599.7577507-52763-133693020052224 `" && echo ansible-tmp-1727204599.7577507-52763-133693020052224="` echo /root/.ansible/tmp/ansible-tmp-1727204599.7577507-52763-133693020052224 `" ) && sleep 0'
51385 1727204599.77298: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
51385 1727204599.77312: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
51385 1727204599.77327: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
51385 1727204599.77344: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
51385 1727204599.77394: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<<
51385 1727204599.77407: stderr chunk (state=3): >>>debug2: match not found <<<
51385 1727204599.77420: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
51385 1727204599.77437: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
51385 1727204599.77449: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<<
51385 1727204599.77466: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
51385 1727204599.77482: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
51385 1727204599.77513: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
51385 1727204599.77528: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
51385 1727204599.77540: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<<
51385 1727204599.77550: stderr chunk (state=3): >>>debug2: match found <<<
51385 1727204599.77567: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
51385 1727204599.77680: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
51385 1727204599.77730: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
51385 1727204599.77838: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
51385 1727204599.78014: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
51385 1727204599.79875: stdout chunk (state=3): >>>ansible-tmp-1727204599.7577507-52763-133693020052224=/root/.ansible/tmp/ansible-tmp-1727204599.7577507-52763-133693020052224 <<<
51385 1727204599.80080: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
51385 1727204599.80083: stdout chunk (state=3): >>><<<
51385 1727204599.80086: stderr chunk (state=3): >>><<<
51385 1727204599.80371: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204599.7577507-52763-133693020052224=/root/.ansible/tmp/ansible-tmp-1727204599.7577507-52763-133693020052224 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received
exit status from master 0 51385 1727204599.80374: variable 'ansible_module_compression' from source: unknown 51385 1727204599.80376: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-51385tpxlmlox/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 51385 1727204599.80378: variable 'ansible_facts' from source: unknown 51385 1727204599.80380: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204599.7577507-52763-133693020052224/AnsiballZ_stat.py 51385 1727204599.81390: Sending initial data 51385 1727204599.81393: Sent initial data (153 bytes) 51385 1727204599.83831: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204599.83838: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204599.83977: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 51385 1727204599.83980: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration <<< 51385 1727204599.84083: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204599.84086: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found <<< 51385 1727204599.84089: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204599.84145: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204599.84187: stderr chunk 
(state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204599.84190: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204599.84258: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204599.85987: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 51385 1727204599.86036: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 51385 1727204599.86088: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-51385tpxlmlox/tmpx1vrdd9v /root/.ansible/tmp/ansible-tmp-1727204599.7577507-52763-133693020052224/AnsiballZ_stat.py <<< 51385 1727204599.86163: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 51385 1727204599.87542: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204599.87804: stderr chunk (state=3): >>><<< 51385 1727204599.87808: stdout chunk (state=3): >>><<< 51385 1727204599.87810: done transferring module to remote 51385 1727204599.87813: _low_level_execute_command(): starting 51385 1727204599.87815: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204599.7577507-52763-133693020052224/ /root/.ansible/tmp/ansible-tmp-1727204599.7577507-52763-133693020052224/AnsiballZ_stat.py && sleep 0' 
51385 1727204599.89291: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204599.89305: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204599.89324: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204599.89345: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204599.89392: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204599.89403: stderr chunk (state=3): >>>debug2: match not found <<< 51385 1727204599.89416: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204599.89432: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 51385 1727204599.89453: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 51385 1727204599.89477: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 51385 1727204599.89487: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204599.89498: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204599.89510: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204599.89519: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204599.89561: stderr chunk (state=3): >>>debug2: match found <<< 51385 1727204599.89578: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204599.89655: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204599.89783: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204599.89798: 
stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204599.89998: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204599.91753: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204599.91757: stdout chunk (state=3): >>><<< 51385 1727204599.91759: stderr chunk (state=3): >>><<< 51385 1727204599.91852: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204599.91855: _low_level_execute_command(): starting 51385 1727204599.91857: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204599.7577507-52763-133693020052224/AnsiballZ_stat.py && sleep 0' 51385 1727204599.92720: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204599.92734: 
stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204599.92751: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204599.92775: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204599.92817: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204599.92830: stderr chunk (state=3): >>>debug2: match not found <<< 51385 1727204599.92843: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204599.92865: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 51385 1727204599.92877: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 51385 1727204599.92887: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 51385 1727204599.92898: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204599.92909: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204599.92924: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204599.92935: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204599.92945: stderr chunk (state=3): >>>debug2: match found <<< 51385 1727204599.92957: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204599.93037: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204599.93054: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204599.93070: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204599.93170: stderr chunk 
(state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204600.06180: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-lsr101", "follow": false, "checksum_algorithm": "sha1"}}} <<< 51385 1727204600.07085: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. <<< 51385 1727204600.07161: stderr chunk (state=3): >>><<< 51385 1727204600.07167: stdout chunk (state=3): >>><<< 51385 1727204600.07189: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-lsr101", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 51385 1727204600.07221: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-lsr101', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204599.7577507-52763-133693020052224/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 51385 1727204600.07229: _low_level_execute_command(): starting 51385 1727204600.07231: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204599.7577507-52763-133693020052224/ > /dev/null 2>&1 && sleep 0' 51385 1727204600.07892: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204600.07900: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204600.07912: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204600.07925: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204600.07970: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204600.07979: stderr chunk (state=3): >>>debug2: match not found <<< 51385 1727204600.07989: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204600.08002: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 51385 1727204600.08010: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 51385 1727204600.08017: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 51385 1727204600.08027: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204600.08035: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204600.08047: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204600.08054: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204600.08061: stderr chunk (state=3): >>>debug2: match found <<< 51385 1727204600.08077: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204600.08154: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204600.08161: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204600.08170: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204600.08259: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204600.10010: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204600.10097: stderr chunk (state=3): >>><<< 51385 1727204600.10114: stdout chunk (state=3): >>><<< 51385 1727204600.10375: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204600.10379: handler run complete 51385 1727204600.10382: attempt loop complete, returning result 51385 1727204600.10384: _execute() done 51385 1727204600.10386: dumping result to json 51385 1727204600.10388: done dumping result, returning 51385 1727204600.10390: done running TaskExecutor() for managed-node1/TASK: Stat profile file [0affcd87-79f5-6b1f-5706-0000000007f1] 51385 1727204600.10392: sending task result for task 0affcd87-79f5-6b1f-5706-0000000007f1 51385 1727204600.10465: done sending task result for task 0affcd87-79f5-6b1f-5706-0000000007f1 51385 1727204600.10469: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "stat": { "exists": false } } 51385 1727204600.10556: no more pending results, returning what we have 51385 1727204600.10560: results queue empty 51385 1727204600.10561: checking for any_errors_fatal 51385 1727204600.10569: done checking for any_errors_fatal 51385 1727204600.10570: checking for max_fail_percentage 51385 1727204600.10572: done checking for max_fail_percentage 51385 1727204600.10573: checking to see if all hosts have failed and the running result 
is not ok 51385 1727204600.10574: done checking to see if all hosts have failed 51385 1727204600.10575: getting the remaining hosts for this loop 51385 1727204600.10577: done getting the remaining hosts for this loop 51385 1727204600.10581: getting the next task for host managed-node1 51385 1727204600.10588: done getting next task for host managed-node1 51385 1727204600.10591: ^ task is: TASK: Set NM profile exist flag based on the profile files 51385 1727204600.10595: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51385 1727204600.10602: getting variables 51385 1727204600.10604: in VariableManager get_vars() 51385 1727204600.10650: Calling all_inventory to load vars for managed-node1 51385 1727204600.10653: Calling groups_inventory to load vars for managed-node1 51385 1727204600.10656: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204600.10670: Calling all_plugins_play to load vars for managed-node1 51385 1727204600.10673: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204600.10677: Calling groups_plugins_play to load vars for managed-node1 51385 1727204600.12542: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204600.14462: done with get_vars() 51385 1727204600.14499: done getting variables 51385 1727204600.14590: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Tuesday 24 September 2024 15:03:20 -0400 (0:00:00.491) 0:00:18.550 ***** 51385 1727204600.14640: entering _queue_task() for managed-node1/set_fact 51385 1727204600.15110: worker is 1 (out of 1 available) 51385 1727204600.15123: exiting _queue_task() for managed-node1/set_fact 51385 1727204600.15143: done queuing things up, now waiting for results queue to drain 51385 1727204600.15144: waiting for pending results... 
51385 1727204600.15456: running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag based on the profile files 51385 1727204600.15591: in run() - task 0affcd87-79f5-6b1f-5706-0000000007f2 51385 1727204600.15613: variable 'ansible_search_path' from source: unknown 51385 1727204600.15621: variable 'ansible_search_path' from source: unknown 51385 1727204600.15665: calling self._execute() 51385 1727204600.15774: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204600.15788: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204600.15810: variable 'omit' from source: magic vars 51385 1727204600.16230: variable 'ansible_distribution_major_version' from source: facts 51385 1727204600.16258: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204600.16400: variable 'profile_stat' from source: set_fact 51385 1727204600.16421: Evaluated conditional (profile_stat.stat.exists): False 51385 1727204600.16428: when evaluation is False, skipping this task 51385 1727204600.16438: _execute() done 51385 1727204600.16452: dumping result to json 51385 1727204600.16469: done dumping result, returning 51385 1727204600.16481: done running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag based on the profile files [0affcd87-79f5-6b1f-5706-0000000007f2] 51385 1727204600.16491: sending task result for task 0affcd87-79f5-6b1f-5706-0000000007f2 skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 51385 1727204600.16641: no more pending results, returning what we have 51385 1727204600.16647: results queue empty 51385 1727204600.16648: checking for any_errors_fatal 51385 1727204600.16657: done checking for any_errors_fatal 51385 1727204600.16657: checking for max_fail_percentage 51385 1727204600.16659: done checking for max_fail_percentage 51385 1727204600.16660: checking to see if all 
hosts have failed and the running result is not ok 51385 1727204600.16661: done checking to see if all hosts have failed 51385 1727204600.16662: getting the remaining hosts for this loop 51385 1727204600.16665: done getting the remaining hosts for this loop 51385 1727204600.16670: getting the next task for host managed-node1 51385 1727204600.16677: done getting next task for host managed-node1 51385 1727204600.16680: ^ task is: TASK: Get NM profile info 51385 1727204600.16685: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51385 1727204600.16691: getting variables 51385 1727204600.16692: in VariableManager get_vars() 51385 1727204600.16739: Calling all_inventory to load vars for managed-node1 51385 1727204600.16741: Calling groups_inventory to load vars for managed-node1 51385 1727204600.16744: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204600.16758: Calling all_plugins_play to load vars for managed-node1 51385 1727204600.16761: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204600.16766: Calling groups_plugins_play to load vars for managed-node1 51385 1727204600.18041: done sending task result for task 0affcd87-79f5-6b1f-5706-0000000007f2 51385 1727204600.18045: WORKER PROCESS EXITING 51385 1727204600.18056: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204600.19073: done with get_vars() 51385 1727204600.19098: done getting variables 51385 1727204600.19239: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Tuesday 24 September 2024 15:03:20 -0400 (0:00:00.046) 0:00:18.596 ***** 51385 1727204600.19281: entering _queue_task() for managed-node1/shell 51385 1727204600.19283: Creating lock for shell 51385 1727204600.19622: worker is 1 (out of 1 available) 51385 1727204600.19636: exiting _queue_task() for managed-node1/shell 51385 1727204600.19649: done queuing things up, now waiting for results queue to drain 51385 1727204600.19650: waiting for pending results... 
51385 1727204600.20014: running TaskExecutor() for managed-node1/TASK: Get NM profile info
51385 1727204600.20186: in run() - task 0affcd87-79f5-6b1f-5706-0000000007f3
51385 1727204600.20214: variable 'ansible_search_path' from source: unknown
51385 1727204600.20221: variable 'ansible_search_path' from source: unknown
51385 1727204600.20269: calling self._execute()
51385 1727204600.20357: variable 'ansible_host' from source: host vars for 'managed-node1'
51385 1727204600.20366: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
51385 1727204600.20376: variable 'omit' from source: magic vars
51385 1727204600.20661: variable 'ansible_distribution_major_version' from source: facts
51385 1727204600.20677: Evaluated conditional (ansible_distribution_major_version != '6'): True
51385 1727204600.20683: variable 'omit' from source: magic vars
51385 1727204600.20718: variable 'omit' from source: magic vars
51385 1727204600.20793: variable 'profile' from source: include params
51385 1727204600.20796: variable 'item' from source: include params
51385 1727204600.20846: variable 'item' from source: include params
51385 1727204600.20862: variable 'omit' from source: magic vars
51385 1727204600.20902: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
51385 1727204600.20930: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
51385 1727204600.20947: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
51385 1727204600.20962: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
51385 1727204600.20975: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
51385 1727204600.21002: variable 'inventory_hostname' from source: host vars for 'managed-node1'
51385 1727204600.21006: variable 'ansible_host' from source: host vars for 'managed-node1'
51385 1727204600.21009: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
51385 1727204600.21082: Set connection var ansible_pipelining to False
51385 1727204600.21085: Set connection var ansible_shell_type to sh
51385 1727204600.21093: Set connection var ansible_module_compression to ZIP_DEFLATED
51385 1727204600.21099: Set connection var ansible_timeout to 10
51385 1727204600.21101: Set connection var ansible_connection to ssh
51385 1727204600.21106: Set connection var ansible_shell_executable to /bin/sh
51385 1727204600.21124: variable 'ansible_shell_executable' from source: unknown
51385 1727204600.21127: variable 'ansible_connection' from source: unknown
51385 1727204600.21130: variable 'ansible_module_compression' from source: unknown
51385 1727204600.21132: variable 'ansible_shell_type' from source: unknown
51385 1727204600.21136: variable 'ansible_shell_executable' from source: unknown
51385 1727204600.21138: variable 'ansible_host' from source: host vars for 'managed-node1'
51385 1727204600.21140: variable 'ansible_pipelining' from source: unknown
51385 1727204600.21143: variable 'ansible_timeout' from source: unknown
51385 1727204600.21148: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
51385 1727204600.21253: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
51385 1727204600.21267: variable 'omit' from source: magic vars
51385 1727204600.21270: starting attempt loop
51385 1727204600.21274: running the handler
51385 1727204600.21285: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
51385 1727204600.21301: _low_level_execute_command(): starting
51385 1727204600.21308: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
51385 1727204600.21837: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
51385 1727204600.21847: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
51385 1727204600.21877: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<<
51385 1727204600.21893: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found <<<
51385 1727204600.21905: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
51385 1727204600.21953: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
51385 1727204600.21957: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
51385 1727204600.21973: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
51385 1727204600.22042: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
51385 1727204600.23628: stdout chunk (state=3): >>>/root <<<
51385 1727204600.23741: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
51385 1727204600.23794: stderr chunk (state=3): >>><<<
51385 1727204600.23798: stdout chunk (state=3): >>><<<
51385 1727204600.23819: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
51385 1727204600.23831: _low_level_execute_command(): starting
51385 1727204600.23837: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204600.238203-52794-215214756096631 `" && echo ansible-tmp-1727204600.238203-52794-215214756096631="` echo /root/.ansible/tmp/ansible-tmp-1727204600.238203-52794-215214756096631 `" ) && sleep 0'
51385 1727204600.24297: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
51385 1727204600.24301: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
51385 1727204600.24313: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
51385 1727204600.24344: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<<
51385 1727204600.24350: stderr chunk (state=3): >>>debug2: match not found <<<
51385 1727204600.24360: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
51385 1727204600.24373: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
51385 1727204600.24380: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<<
51385 1727204600.24385: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
51385 1727204600.24397: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
51385 1727204600.24401: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<<
51385 1727204600.24408: stderr chunk (state=3): >>>debug2: match found <<<
51385 1727204600.24415: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
51385 1727204600.24475: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
51385 1727204600.24483: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
51385 1727204600.24488: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
51385 1727204600.24566: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
51385 1727204600.26408: stdout chunk (state=3): >>>ansible-tmp-1727204600.238203-52794-215214756096631=/root/.ansible/tmp/ansible-tmp-1727204600.238203-52794-215214756096631 <<<
51385 1727204600.26586: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
51385 1727204600.26695: stderr chunk (state=3): >>><<<
51385 1727204600.26698: stdout chunk (state=3): >>><<<
51385 1727204600.26707: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204600.238203-52794-215214756096631=/root/.ansible/tmp/ansible-tmp-1727204600.238203-52794-215214756096631 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
51385 1727204600.26781: variable 'ansible_module_compression' from source: unknown
51385 1727204600.26999: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-51385tpxlmlox/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED
51385 1727204600.27037: variable 'ansible_facts' from source: unknown
51385 1727204600.27126: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204600.238203-52794-215214756096631/AnsiballZ_command.py
51385 1727204600.27976: Sending initial data
51385 1727204600.27979: Sent initial data (155 bytes)
51385 1727204600.28989: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
51385 1727204600.29007: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
51385 1727204600.29049: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<<
51385 1727204600.29054: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<<
51385 1727204600.29076: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
51385 1727204600.29096: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found <<<
51385 1727204600.29101: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
51385 1727204600.29230: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<<
51385 1727204600.29250: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
51385 1727204600.29332: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
51385 1727204600.31026: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<<
51385 1727204600.31091: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<<
51385 1727204600.31151: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-51385tpxlmlox/tmpujsonvst /root/.ansible/tmp/ansible-tmp-1727204600.238203-52794-215214756096631/AnsiballZ_command.py <<<
51385 1727204600.31209: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<<
51385 1727204600.32378: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
51385 1727204600.32480: stderr chunk (state=3): >>><<<
51385 1727204600.32483: stdout chunk (state=3): >>><<<
51385 1727204600.32499: done transferring module to remote
51385 1727204600.32511: _low_level_execute_command(): starting
51385 1727204600.32516: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204600.238203-52794-215214756096631/ /root/.ansible/tmp/ansible-tmp-1727204600.238203-52794-215214756096631/AnsiballZ_command.py && sleep 0'
51385 1727204600.32956: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<<
51385 1727204600.32967: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
51385 1727204600.32994: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<<
51385 1727204600.33002: stderr chunk (state=3): >>>debug2: match not found <<<
51385 1727204600.33010: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
51385 1727204600.33020: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
51385 1727204600.33027: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<<
51385 1727204600.33033: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<<
51385 1727204600.33040: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
51385 1727204600.33052: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
51385 1727204600.33056: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<<
51385 1727204600.33072: stderr chunk (state=3): >>>debug2: match found <<<
51385 1727204600.33075: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
51385 1727204600.33120: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
51385 1727204600.33141: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
51385 1727204600.33144: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
51385 1727204600.33201: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
51385 1727204600.34911: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
51385 1727204600.34962: stderr chunk (state=3): >>><<<
51385 1727204600.34968: stdout chunk (state=3): >>><<<
51385 1727204600.34986: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
51385 1727204600.34989: _low_level_execute_command(): starting
51385 1727204600.34998: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204600.238203-52794-215214756096631/AnsiballZ_command.py && sleep 0'
51385 1727204600.35450: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<<
51385 1727204600.35455: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
51385 1727204600.35489: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
51385 1727204600.35502: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
51385 1727204600.35557: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<<
51385 1727204600.35568: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
51385 1727204600.35639: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
51385 1727204600.50829: stdout chunk (state=3): >>> {"changed": true, "stdout": "lsr101 /etc/NetworkManager/system-connections/lsr101.nmconnection \nlsr101.90 /etc/NetworkManager/system-connections/lsr101.90.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep lsr101 | grep /etc", "start": "2024-09-24 15:03:20.486079", "end": "2024-09-24 15:03:20.507648", "delta": "0:00:00.021569", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep lsr101 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<<
51385 1727204600.51899: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. <<<
51385 1727204600.51957: stderr chunk (state=3): >>><<<
51385 1727204600.51963: stdout chunk (state=3): >>><<<
51385 1727204600.51980: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "lsr101 /etc/NetworkManager/system-connections/lsr101.nmconnection \nlsr101.90 /etc/NetworkManager/system-connections/lsr101.90.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep lsr101 | grep /etc", "start": "2024-09-24 15:03:20.486079", "end": "2024-09-24 15:03:20.507648", "delta": "0:00:00.021569", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep lsr101 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed.
51385 1727204600.52007: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep lsr101 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204600.238203-52794-215214756096631/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None})
51385 1727204600.52016: _low_level_execute_command(): starting
51385 1727204600.52018: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204600.238203-52794-215214756096631/ > /dev/null 2>&1 && sleep 0'
51385 1727204600.52483: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<<
51385 1727204600.52494: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
51385 1727204600.52522: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
51385 1727204600.52534: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
51385 1727204600.52583: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
51385 1727204600.52595: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
51385 1727204600.52670: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
51385 1727204600.54438: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
51385 1727204600.54523: stderr chunk (state=3): >>><<<
51385 1727204600.54527: stdout chunk (state=3): >>><<<
51385 1727204600.54556: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
51385 1727204600.54567: handler run complete
51385 1727204600.54589: Evaluated conditional (False): False
51385 1727204600.54599: attempt loop complete, returning result
51385 1727204600.54602: _execute() done
51385 1727204600.54604: dumping result to json
51385 1727204600.54610: done dumping result, returning
51385 1727204600.54618: done running TaskExecutor() for managed-node1/TASK: Get NM profile info [0affcd87-79f5-6b1f-5706-0000000007f3]
51385 1727204600.54629: sending task result for task 0affcd87-79f5-6b1f-5706-0000000007f3
51385 1727204600.54740: done sending task result for task 0affcd87-79f5-6b1f-5706-0000000007f3
51385 1727204600.54743: WORKER PROCESS EXITING
ok: [managed-node1] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep lsr101 | grep /etc", "delta": "0:00:00.021569", "end": "2024-09-24 15:03:20.507648", "rc": 0, "start": "2024-09-24 15:03:20.486079" }
STDOUT:
lsr101 /etc/NetworkManager/system-connections/lsr101.nmconnection
lsr101.90 /etc/NetworkManager/system-connections/lsr101.90.nmconnection
51385 1727204600.54824: no more pending results, returning what we have
51385 1727204600.54828: results queue empty
51385 1727204600.54830: checking for any_errors_fatal
51385 1727204600.54835: done checking for any_errors_fatal
51385 1727204600.54836: checking for max_fail_percentage
51385 1727204600.54841: done checking for max_fail_percentage
51385 1727204600.54842: checking to see if all hosts have failed and the running result is not ok
51385 1727204600.54843: done checking to see if all hosts have failed
51385 1727204600.54844: getting the remaining hosts for this loop
51385 1727204600.54846: done getting the remaining hosts for this loop
51385 1727204600.54849: getting the next task for host managed-node1
51385 1727204600.54857: done getting next task for host managed-node1
51385 1727204600.54859: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
51385 1727204600.54865: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
51385 1727204600.54871: getting variables
51385 1727204600.54873: in VariableManager get_vars()
51385 1727204600.54918: Calling all_inventory to load vars for managed-node1
51385 1727204600.54921: Calling groups_inventory to load vars for managed-node1
51385 1727204600.54924: Calling all_plugins_inventory to load vars for managed-node1
51385 1727204600.54935: Calling all_plugins_play to load vars for managed-node1
51385 1727204600.54938: Calling groups_plugins_inventory to load vars for managed-node1
51385 1727204600.54941: Calling groups_plugins_play to load vars for managed-node1
51385 1727204600.56740: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
51385 1727204600.58783: done with get_vars()
51385 1727204600.58812: done getting variables
51385 1727204600.58884: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] ***
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35
Tuesday 24 September 2024  15:03:20 -0400 (0:00:00.396)       0:00:18.993 *****
51385 1727204600.58918: entering _queue_task() for managed-node1/set_fact
51385 1727204600.59613: worker is 1 (out of 1 available)
51385 1727204600.59628: exiting _queue_task() for managed-node1/set_fact
51385 1727204600.59644: done queuing things up, now waiting for results queue to drain
51385 1727204600.59645: waiting for pending results...
51385 1727204600.60450: running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
51385 1727204600.60581: in run() - task 0affcd87-79f5-6b1f-5706-0000000007f4
51385 1727204600.60712: variable 'ansible_search_path' from source: unknown
51385 1727204600.60719: variable 'ansible_search_path' from source: unknown
51385 1727204600.60767: calling self._execute()
51385 1727204600.61100: variable 'ansible_host' from source: host vars for 'managed-node1'
51385 1727204600.61174: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
51385 1727204600.61192: variable 'omit' from source: magic vars
51385 1727204600.61661: variable 'ansible_distribution_major_version' from source: facts
51385 1727204600.61681: Evaluated conditional (ansible_distribution_major_version != '6'): True
51385 1727204600.61824: variable 'nm_profile_exists' from source: set_fact
51385 1727204600.61847: Evaluated conditional (nm_profile_exists.rc == 0): True
51385 1727204600.61857: variable 'omit' from source: magic vars
51385 1727204600.61914: variable 'omit' from source: magic vars
51385 1727204600.61953: variable 'omit' from source: magic vars
51385 1727204600.62002: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
51385 1727204600.62042: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
51385 1727204600.62075: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
51385 1727204600.62098: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
51385 1727204600.62113: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
51385 1727204600.62144: variable 'inventory_hostname' from source: host vars for 'managed-node1'
51385 1727204600.62151: variable 'ansible_host' from source: host vars for 'managed-node1'
51385 1727204600.62167: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
51385 1727204600.62274: Set connection var ansible_pipelining to False
51385 1727204600.62282: Set connection var ansible_shell_type to sh
51385 1727204600.62298: Set connection var ansible_module_compression to ZIP_DEFLATED
51385 1727204600.62309: Set connection var ansible_timeout to 10
51385 1727204600.62314: Set connection var ansible_connection to ssh
51385 1727204600.62322: Set connection var ansible_shell_executable to /bin/sh
51385 1727204600.62346: variable 'ansible_shell_executable' from source: unknown
51385 1727204600.62353: variable 'ansible_connection' from source: unknown
51385 1727204600.62361: variable 'ansible_module_compression' from source: unknown
51385 1727204600.62370: variable 'ansible_shell_type' from source: unknown
51385 1727204600.62380: variable 'ansible_shell_executable' from source: unknown
51385 1727204600.62386: variable 'ansible_host' from source: host vars for 'managed-node1'
51385 1727204600.62393: variable 'ansible_pipelining' from source: unknown
51385 1727204600.62400: variable 'ansible_timeout' from source: unknown
51385 1727204600.62407: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
51385 1727204600.62551: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
51385 1727204600.62572: variable 'omit' from source: magic vars
51385 1727204600.62582: starting attempt loop
51385 1727204600.62588: running the handler
51385 1727204600.62607: handler run complete
51385 1727204600.62622: attempt loop complete, returning result
51385 1727204600.62628: _execute() done
51385 1727204600.62634: dumping result to json
51385 1727204600.62641: done dumping result, returning
51385 1727204600.62650: done running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0affcd87-79f5-6b1f-5706-0000000007f4]
51385 1727204600.62662: sending task result for task 0affcd87-79f5-6b1f-5706-0000000007f4
51385 1727204600.62791: done sending task result for task 0affcd87-79f5-6b1f-5706-0000000007f4
51385 1727204600.62798: WORKER PROCESS EXITING
ok: [managed-node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false }
51385 1727204600.62869: no more pending results, returning what we have
51385 1727204600.62873: results queue empty
51385 1727204600.62874: checking for any_errors_fatal
51385 1727204600.62883: done checking for any_errors_fatal
51385 1727204600.62884: checking for max_fail_percentage
51385 1727204600.62886: done checking for max_fail_percentage
51385 1727204600.62887: checking to see if all hosts have failed and the running result is not ok
51385 1727204600.62888: done checking to see if all hosts have failed
51385 1727204600.62889: getting the remaining hosts for this loop
51385 1727204600.62891: done getting the remaining hosts for this loop
51385 1727204600.62895: getting the next task for host managed-node1
51385 1727204600.62905: done getting next task for host managed-node1
51385 1727204600.62907: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }}
51385 1727204600.62912: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
51385 1727204600.62917: getting variables
51385 1727204600.62919: in VariableManager get_vars()
51385 1727204600.62971: Calling all_inventory to load vars for managed-node1
51385 1727204600.62974: Calling groups_inventory to load vars for managed-node1
51385 1727204600.62978: Calling all_plugins_inventory to load vars for managed-node1
51385 1727204600.62989: Calling all_plugins_play to load vars for managed-node1
51385 1727204600.62993: Calling groups_plugins_inventory to load vars for managed-node1
51385 1727204600.62997: Calling groups_plugins_play to load vars for managed-node1
51385 1727204600.65509: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
51385 1727204600.67198: done with get_vars()
51385 1727204600.67225: done getting variables
51385 1727204600.67297: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
51385 1727204600.67444: variable 'profile' from source: include params
51385 1727204600.67448: variable 'item' from source: include params
51385 1727204600.67512: variable 'item' from source: include params

TASK [Get the ansible_managed comment in ifcfg-lsr101] *************************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49
Tuesday 24 September 2024  15:03:20 -0400 (0:00:00.086)       0:00:19.079 *****
51385 1727204600.67550: entering _queue_task() for managed-node1/command
51385 1727204600.67871: worker is 1 (out of 1 available)
51385 1727204600.67885: exiting _queue_task() for managed-node1/command
51385 1727204600.67896: done queuing things up, now waiting for results queue to drain
51385 1727204600.67898: waiting for pending results...
51385 1727204600.68192: running TaskExecutor() for managed-node1/TASK: Get the ansible_managed comment in ifcfg-lsr101 51385 1727204600.68316: in run() - task 0affcd87-79f5-6b1f-5706-0000000007f6 51385 1727204600.68338: variable 'ansible_search_path' from source: unknown 51385 1727204600.68349: variable 'ansible_search_path' from source: unknown 51385 1727204600.68394: calling self._execute() 51385 1727204600.68496: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204600.68507: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204600.68520: variable 'omit' from source: magic vars 51385 1727204600.68903: variable 'ansible_distribution_major_version' from source: facts 51385 1727204600.68921: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204600.69045: variable 'profile_stat' from source: set_fact 51385 1727204600.69068: Evaluated conditional (profile_stat.stat.exists): False 51385 1727204600.69077: when evaluation is False, skipping this task 51385 1727204600.69084: _execute() done 51385 1727204600.69092: dumping result to json 51385 1727204600.69102: done dumping result, returning 51385 1727204600.69114: done running TaskExecutor() for managed-node1/TASK: Get the ansible_managed comment in ifcfg-lsr101 [0affcd87-79f5-6b1f-5706-0000000007f6] 51385 1727204600.69125: sending task result for task 0affcd87-79f5-6b1f-5706-0000000007f6 skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 51385 1727204600.69276: no more pending results, returning what we have 51385 1727204600.69280: results queue empty 51385 1727204600.69282: checking for any_errors_fatal 51385 1727204600.69290: done checking for any_errors_fatal 51385 1727204600.69290: checking for max_fail_percentage 51385 1727204600.69292: done checking for max_fail_percentage 51385 1727204600.69293: checking to see if all hosts 
have failed and the running result is not ok 51385 1727204600.69295: done checking to see if all hosts have failed 51385 1727204600.69295: getting the remaining hosts for this loop 51385 1727204600.69297: done getting the remaining hosts for this loop 51385 1727204600.69302: getting the next task for host managed-node1 51385 1727204600.69310: done getting next task for host managed-node1 51385 1727204600.69312: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 51385 1727204600.69317: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51385 1727204600.69323: getting variables 51385 1727204600.69325: in VariableManager get_vars() 51385 1727204600.69377: Calling all_inventory to load vars for managed-node1 51385 1727204600.69380: Calling groups_inventory to load vars for managed-node1 51385 1727204600.69383: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204600.69397: Calling all_plugins_play to load vars for managed-node1 51385 1727204600.69400: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204600.69403: Calling groups_plugins_play to load vars for managed-node1 51385 1727204600.70383: done sending task result for task 0affcd87-79f5-6b1f-5706-0000000007f6 51385 1727204600.70387: WORKER PROCESS EXITING 51385 1727204600.71298: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204600.72935: done with get_vars() 51385 1727204600.72965: done getting variables 51385 1727204600.73728: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 51385 1727204600.73851: variable 'profile' from source: include params 51385 1727204600.73855: variable 'item' from source: include params 51385 1727204600.73921: variable 'item' from source: include params TASK [Verify the ansible_managed comment in ifcfg-lsr101] ********************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Tuesday 24 September 2024 15:03:20 -0400 (0:00:00.064) 0:00:19.143 ***** 51385 1727204600.73954: entering _queue_task() for managed-node1/set_fact 51385 1727204600.74290: worker is 1 (out of 1 available) 51385 1727204600.74302: exiting _queue_task() for managed-node1/set_fact 51385 
1727204600.74314: done queuing things up, now waiting for results queue to drain 51385 1727204600.74315: waiting for pending results... 51385 1727204600.74823: running TaskExecutor() for managed-node1/TASK: Verify the ansible_managed comment in ifcfg-lsr101 51385 1727204600.74950: in run() - task 0affcd87-79f5-6b1f-5706-0000000007f7 51385 1727204600.74966: variable 'ansible_search_path' from source: unknown 51385 1727204600.74970: variable 'ansible_search_path' from source: unknown 51385 1727204600.75005: calling self._execute() 51385 1727204600.75119: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204600.75122: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204600.75133: variable 'omit' from source: magic vars 51385 1727204600.75541: variable 'ansible_distribution_major_version' from source: facts 51385 1727204600.75554: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204600.75684: variable 'profile_stat' from source: set_fact 51385 1727204600.75701: Evaluated conditional (profile_stat.stat.exists): False 51385 1727204600.75705: when evaluation is False, skipping this task 51385 1727204600.75707: _execute() done 51385 1727204600.75710: dumping result to json 51385 1727204600.75712: done dumping result, returning 51385 1727204600.75720: done running TaskExecutor() for managed-node1/TASK: Verify the ansible_managed comment in ifcfg-lsr101 [0affcd87-79f5-6b1f-5706-0000000007f7] 51385 1727204600.75726: sending task result for task 0affcd87-79f5-6b1f-5706-0000000007f7 51385 1727204600.75820: done sending task result for task 0affcd87-79f5-6b1f-5706-0000000007f7 51385 1727204600.75823: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 51385 1727204600.75898: no more pending results, returning what we have 51385 1727204600.75902: results queue empty 
51385 1727204600.75903: checking for any_errors_fatal 51385 1727204600.75909: done checking for any_errors_fatal 51385 1727204600.75910: checking for max_fail_percentage 51385 1727204600.75912: done checking for max_fail_percentage 51385 1727204600.75913: checking to see if all hosts have failed and the running result is not ok 51385 1727204600.75914: done checking to see if all hosts have failed 51385 1727204600.75915: getting the remaining hosts for this loop 51385 1727204600.75916: done getting the remaining hosts for this loop 51385 1727204600.75920: getting the next task for host managed-node1 51385 1727204600.75926: done getting next task for host managed-node1 51385 1727204600.75928: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 51385 1727204600.75932: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51385 1727204600.75936: getting variables 51385 1727204600.75937: in VariableManager get_vars() 51385 1727204600.75979: Calling all_inventory to load vars for managed-node1 51385 1727204600.75982: Calling groups_inventory to load vars for managed-node1 51385 1727204600.75984: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204600.75994: Calling all_plugins_play to load vars for managed-node1 51385 1727204600.75996: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204600.75999: Calling groups_plugins_play to load vars for managed-node1 51385 1727204600.78215: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204600.82105: done with get_vars() 51385 1727204600.82142: done getting variables 51385 1727204600.82214: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 51385 1727204600.82342: variable 'profile' from source: include params 51385 1727204600.82346: variable 'item' from source: include params 51385 1727204600.82414: variable 'item' from source: include params TASK [Get the fingerprint comment in ifcfg-lsr101] ***************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Tuesday 24 September 2024 15:03:20 -0400 (0:00:00.084) 0:00:19.228 ***** 51385 1727204600.82450: entering _queue_task() for managed-node1/command 51385 1727204600.83107: worker is 1 (out of 1 available) 51385 1727204600.83120: exiting _queue_task() for managed-node1/command 51385 1727204600.83134: done queuing things up, now waiting for results queue to drain 51385 1727204600.83135: waiting for pending results... 
51385 1727204600.84050: running TaskExecutor() for managed-node1/TASK: Get the fingerprint comment in ifcfg-lsr101 51385 1727204600.84444: in run() - task 0affcd87-79f5-6b1f-5706-0000000007f8 51385 1727204600.84469: variable 'ansible_search_path' from source: unknown 51385 1727204600.84477: variable 'ansible_search_path' from source: unknown 51385 1727204600.84519: calling self._execute() 51385 1727204600.84628: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204600.84757: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204600.84778: variable 'omit' from source: magic vars 51385 1727204600.85500: variable 'ansible_distribution_major_version' from source: facts 51385 1727204600.85641: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204600.85782: variable 'profile_stat' from source: set_fact 51385 1727204600.85800: Evaluated conditional (profile_stat.stat.exists): False 51385 1727204600.85806: when evaluation is False, skipping this task 51385 1727204600.85813: _execute() done 51385 1727204600.85820: dumping result to json 51385 1727204600.85827: done dumping result, returning 51385 1727204600.85837: done running TaskExecutor() for managed-node1/TASK: Get the fingerprint comment in ifcfg-lsr101 [0affcd87-79f5-6b1f-5706-0000000007f8] 51385 1727204600.85849: sending task result for task 0affcd87-79f5-6b1f-5706-0000000007f8 skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 51385 1727204600.86001: no more pending results, returning what we have 51385 1727204600.86005: results queue empty 51385 1727204600.86007: checking for any_errors_fatal 51385 1727204600.86013: done checking for any_errors_fatal 51385 1727204600.86014: checking for max_fail_percentage 51385 1727204600.86015: done checking for max_fail_percentage 51385 1727204600.86017: checking to see if all hosts have failed 
and the running result is not ok 51385 1727204600.86017: done checking to see if all hosts have failed 51385 1727204600.86018: getting the remaining hosts for this loop 51385 1727204600.86020: done getting the remaining hosts for this loop 51385 1727204600.86024: getting the next task for host managed-node1 51385 1727204600.86032: done getting next task for host managed-node1 51385 1727204600.86035: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 51385 1727204600.86040: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51385 1727204600.86045: getting variables 51385 1727204600.86047: in VariableManager get_vars() 51385 1727204600.86097: Calling all_inventory to load vars for managed-node1 51385 1727204600.86101: Calling groups_inventory to load vars for managed-node1 51385 1727204600.86103: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204600.86117: Calling all_plugins_play to load vars for managed-node1 51385 1727204600.86120: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204600.86123: Calling groups_plugins_play to load vars for managed-node1 51385 1727204600.87084: done sending task result for task 0affcd87-79f5-6b1f-5706-0000000007f8 51385 1727204600.87087: WORKER PROCESS EXITING 51385 1727204600.88009: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204600.89682: done with get_vars() 51385 1727204600.89708: done getting variables 51385 1727204600.89772: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 51385 1727204600.89890: variable 'profile' from source: include params 51385 1727204600.89894: variable 'item' from source: include params 51385 1727204600.89947: variable 'item' from source: include params TASK [Verify the fingerprint comment in ifcfg-lsr101] ************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Tuesday 24 September 2024 15:03:20 -0400 (0:00:00.075) 0:00:19.303 ***** 51385 1727204600.89987: entering _queue_task() for managed-node1/set_fact 51385 1727204600.90314: worker is 1 (out of 1 available) 51385 1727204600.90327: exiting _queue_task() for managed-node1/set_fact 51385 
1727204600.90340: done queuing things up, now waiting for results queue to drain 51385 1727204600.90341: waiting for pending results... 51385 1727204600.91202: running TaskExecutor() for managed-node1/TASK: Verify the fingerprint comment in ifcfg-lsr101 51385 1727204600.91570: in run() - task 0affcd87-79f5-6b1f-5706-0000000007f9 51385 1727204600.91591: variable 'ansible_search_path' from source: unknown 51385 1727204600.91599: variable 'ansible_search_path' from source: unknown 51385 1727204600.91755: calling self._execute() 51385 1727204600.91979: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204600.91991: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204600.92007: variable 'omit' from source: magic vars 51385 1727204600.92782: variable 'ansible_distribution_major_version' from source: facts 51385 1727204600.92862: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204600.93107: variable 'profile_stat' from source: set_fact 51385 1727204600.93305: Evaluated conditional (profile_stat.stat.exists): False 51385 1727204600.93313: when evaluation is False, skipping this task 51385 1727204600.93320: _execute() done 51385 1727204600.93327: dumping result to json 51385 1727204600.93336: done dumping result, returning 51385 1727204600.93346: done running TaskExecutor() for managed-node1/TASK: Verify the fingerprint comment in ifcfg-lsr101 [0affcd87-79f5-6b1f-5706-0000000007f9] 51385 1727204600.93370: sending task result for task 0affcd87-79f5-6b1f-5706-0000000007f9 skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 51385 1727204600.93561: no more pending results, returning what we have 51385 1727204600.93568: results queue empty 51385 1727204600.93569: checking for any_errors_fatal 51385 1727204600.93576: done checking for any_errors_fatal 51385 1727204600.93577: checking for 
max_fail_percentage 51385 1727204600.93579: done checking for max_fail_percentage 51385 1727204600.93580: checking to see if all hosts have failed and the running result is not ok 51385 1727204600.93581: done checking to see if all hosts have failed 51385 1727204600.93581: getting the remaining hosts for this loop 51385 1727204600.93583: done getting the remaining hosts for this loop 51385 1727204600.93587: getting the next task for host managed-node1 51385 1727204600.93597: done getting next task for host managed-node1 51385 1727204600.93600: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 51385 1727204600.93604: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51385 1727204600.93611: getting variables 51385 1727204600.93613: in VariableManager get_vars() 51385 1727204600.93665: Calling all_inventory to load vars for managed-node1 51385 1727204600.93669: Calling groups_inventory to load vars for managed-node1 51385 1727204600.93671: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204600.93685: Calling all_plugins_play to load vars for managed-node1 51385 1727204600.93688: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204600.93691: Calling groups_plugins_play to load vars for managed-node1 51385 1727204600.94901: done sending task result for task 0affcd87-79f5-6b1f-5706-0000000007f9 51385 1727204600.94905: WORKER PROCESS EXITING 51385 1727204600.95775: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204600.97755: done with get_vars() 51385 1727204600.97788: done getting variables 51385 1727204600.97857: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 51385 1727204600.97998: variable 'profile' from source: include params 51385 1727204600.98002: variable 'item' from source: include params 51385 1727204600.98073: variable 'item' from source: include params TASK [Assert that the profile is present - 'lsr101'] *************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Tuesday 24 September 2024 15:03:20 -0400 (0:00:00.081) 0:00:19.384 ***** 51385 1727204600.98106: entering _queue_task() for managed-node1/assert 51385 1727204600.98440: worker is 1 (out of 1 available) 51385 1727204600.98452: exiting _queue_task() for managed-node1/assert 51385 
1727204600.98468: done queuing things up, now waiting for results queue to drain 51385 1727204600.98470: waiting for pending results... 51385 1727204600.98750: running TaskExecutor() for managed-node1/TASK: Assert that the profile is present - 'lsr101' 51385 1727204600.98876: in run() - task 0affcd87-79f5-6b1f-5706-0000000006b9 51385 1727204600.98895: variable 'ansible_search_path' from source: unknown 51385 1727204600.98902: variable 'ansible_search_path' from source: unknown 51385 1727204600.98945: calling self._execute() 51385 1727204600.99049: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204600.99062: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204600.99078: variable 'omit' from source: magic vars 51385 1727204600.99456: variable 'ansible_distribution_major_version' from source: facts 51385 1727204600.99479: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204600.99491: variable 'omit' from source: magic vars 51385 1727204600.99533: variable 'omit' from source: magic vars 51385 1727204600.99642: variable 'profile' from source: include params 51385 1727204600.99652: variable 'item' from source: include params 51385 1727204600.99719: variable 'item' from source: include params 51385 1727204600.99741: variable 'omit' from source: magic vars 51385 1727204600.99794: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 51385 1727204600.99833: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 51385 1727204600.99861: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 51385 1727204600.99890: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 51385 1727204600.99906: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 51385 1727204600.99939: variable 'inventory_hostname' from source: host vars for 'managed-node1' 51385 1727204600.99947: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204600.99953: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204601.00055: Set connection var ansible_pipelining to False 51385 1727204601.00068: Set connection var ansible_shell_type to sh 51385 1727204601.00083: Set connection var ansible_module_compression to ZIP_DEFLATED 51385 1727204601.00093: Set connection var ansible_timeout to 10 51385 1727204601.00098: Set connection var ansible_connection to ssh 51385 1727204601.00110: Set connection var ansible_shell_executable to /bin/sh 51385 1727204601.00135: variable 'ansible_shell_executable' from source: unknown 51385 1727204601.00141: variable 'ansible_connection' from source: unknown 51385 1727204601.00147: variable 'ansible_module_compression' from source: unknown 51385 1727204601.00152: variable 'ansible_shell_type' from source: unknown 51385 1727204601.00157: variable 'ansible_shell_executable' from source: unknown 51385 1727204601.00166: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204601.00174: variable 'ansible_pipelining' from source: unknown 51385 1727204601.00180: variable 'ansible_timeout' from source: unknown 51385 1727204601.00187: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204601.00337: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 51385 1727204601.00356: variable 'omit' from source: magic vars 51385 1727204601.00372: starting 
attempt loop
51385 1727204601.00379: running the handler
51385 1727204601.00498: variable 'lsr_net_profile_exists' from source: set_fact
51385 1727204601.00508: Evaluated conditional (lsr_net_profile_exists): True
51385 1727204601.00520: handler run complete
51385 1727204601.00544: attempt loop complete, returning result
51385 1727204601.00551: _execute() done
51385 1727204601.00558: dumping result to json
51385 1727204601.00572: done dumping result, returning
51385 1727204601.00582: done running TaskExecutor() for managed-node1/TASK: Assert that the profile is present - 'lsr101' [0affcd87-79f5-6b1f-5706-0000000006b9]
51385 1727204601.00594: sending task result for task 0affcd87-79f5-6b1f-5706-0000000006b9
ok: [managed-node1] => {
    "changed": false
}

MSG:

All assertions passed
51385 1727204601.00747: no more pending results, returning what we have
51385 1727204601.00751: results queue empty
51385 1727204601.00752: checking for any_errors_fatal
51385 1727204601.00762: done checking for any_errors_fatal
51385 1727204601.00762: checking for max_fail_percentage
51385 1727204601.00766: done checking for max_fail_percentage
51385 1727204601.00767: checking to see if all hosts have failed and the running result is not ok
51385 1727204601.00768: done checking to see if all hosts have failed
51385 1727204601.00769: getting the remaining hosts for this loop
51385 1727204601.00771: done getting the remaining hosts for this loop
51385 1727204601.00775: getting the next task for host managed-node1
51385 1727204601.00782: done getting next task for host managed-node1
51385 1727204601.00785: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}'
51385 1727204601.00788: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
51385 1727204601.00793: getting variables
51385 1727204601.00795: in VariableManager get_vars()
51385 1727204601.00840: Calling all_inventory to load vars for managed-node1
51385 1727204601.00843: Calling groups_inventory to load vars for managed-node1
51385 1727204601.00846: Calling all_plugins_inventory to load vars for managed-node1
51385 1727204601.00861: Calling all_plugins_play to load vars for managed-node1
51385 1727204601.00867: Calling groups_plugins_inventory to load vars for managed-node1
51385 1727204601.00870: Calling groups_plugins_play to load vars for managed-node1
51385 1727204601.02502: done sending task result for task 0affcd87-79f5-6b1f-5706-0000000006b9
51385 1727204601.02506: WORKER PROCESS EXITING
51385 1727204601.02658: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
51385 1727204601.04483: done with get_vars()
51385 1727204601.04520: done getting variables
51385 1727204601.04599: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
51385 1727204601.04725: variable 'profile' from source: include params
51385 1727204601.04729: variable 'item' from source: include params
51385 1727204601.04795: variable 'item' from source: include params

TASK [Assert that the ansible managed comment is present in 'lsr101'] **********
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10
Tuesday 24 September 2024 15:03:21 -0400 (0:00:00.067) 0:00:19.452 *****
51385 1727204601.04842: entering _queue_task() for managed-node1/assert
51385 1727204601.05208: worker is 1 (out of 1 available)
51385 1727204601.05229: exiting _queue_task() for managed-node1/assert
51385 1727204601.05249: done queuing things up, now waiting for results queue to drain
51385 1727204601.05250: waiting for pending results...
51385 1727204601.05581: running TaskExecutor() for managed-node1/TASK: Assert that the ansible managed comment is present in 'lsr101'
51385 1727204601.05731: in run() - task 0affcd87-79f5-6b1f-5706-0000000006ba
51385 1727204601.05751: variable 'ansible_search_path' from source: unknown
51385 1727204601.05758: variable 'ansible_search_path' from source: unknown
51385 1727204601.05810: calling self._execute()
51385 1727204601.05919: variable 'ansible_host' from source: host vars for 'managed-node1'
51385 1727204601.05932: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
51385 1727204601.05946: variable 'omit' from source: magic vars
51385 1727204601.06350: variable 'ansible_distribution_major_version' from source: facts
51385 1727204601.06377: Evaluated conditional (ansible_distribution_major_version != '6'): True
51385 1727204601.06389: variable 'omit' from source: magic vars
51385 1727204601.06433: variable 'omit' from source: magic vars
51385 1727204601.06553: variable 'profile' from source: include params
51385 1727204601.06568: variable 'item' from source: include params
51385 1727204601.06642: variable 'item' from source: include params
51385 1727204601.06673: variable 'omit' from source: magic vars
51385 1727204601.06721: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
51385 1727204601.06765: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
51385 1727204601.06798: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
51385 1727204601.06822: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
51385 1727204601.06841: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
51385 1727204601.06882: variable 'inventory_hostname' from source: host vars for 'managed-node1'
51385 1727204601.06889: variable 'ansible_host' from source: host vars for 'managed-node1'
51385 1727204601.06895: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
51385 1727204601.07017: Set connection var ansible_pipelining to False
51385 1727204601.07025: Set connection var ansible_shell_type to sh
51385 1727204601.07040: Set connection var ansible_module_compression to ZIP_DEFLATED
51385 1727204601.07051: Set connection var ansible_timeout to 10
51385 1727204601.07056: Set connection var ansible_connection to ssh
51385 1727204601.07071: Set connection var ansible_shell_executable to /bin/sh
51385 1727204601.07098: variable 'ansible_shell_executable' from source: unknown
51385 1727204601.07105: variable 'ansible_connection' from source: unknown
51385 1727204601.07110: variable 'ansible_module_compression' from source: unknown
51385 1727204601.07115: variable 'ansible_shell_type' from source: unknown
51385 1727204601.07132: variable 'ansible_shell_executable' from source: unknown
51385 1727204601.07139: variable 'ansible_host' from source: host vars for 'managed-node1'
51385 1727204601.07147: variable 'ansible_pipelining' from source: unknown
51385 1727204601.07155: variable 'ansible_timeout' from source: unknown
51385 1727204601.07174: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
51385 1727204601.07321: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
51385 1727204601.07343: variable 'omit' from source: magic vars
51385 1727204601.07355: starting attempt loop
51385 1727204601.07373: running the handler
51385 1727204601.07507: variable 'lsr_net_profile_ansible_managed' from source: set_fact
51385 1727204601.07517: Evaluated conditional (lsr_net_profile_ansible_managed): True
51385 1727204601.07527: handler run complete
51385 1727204601.07544: attempt loop complete, returning result
51385 1727204601.07550: _execute() done
51385 1727204601.07562: dumping result to json
51385 1727204601.07572: done dumping result, returning
51385 1727204601.07583: done running TaskExecutor() for managed-node1/TASK: Assert that the ansible managed comment is present in 'lsr101' [0affcd87-79f5-6b1f-5706-0000000006ba]
51385 1727204601.07594: sending task result for task 0affcd87-79f5-6b1f-5706-0000000006ba
ok: [managed-node1] => {
    "changed": false
}

MSG:

All assertions passed
51385 1727204601.07747: no more pending results, returning what we have
51385 1727204601.07752: results queue empty
51385 1727204601.07753: checking for any_errors_fatal
51385 1727204601.07758: done checking for any_errors_fatal
51385 1727204601.07762: checking for max_fail_percentage
51385 1727204601.07765: done checking for max_fail_percentage
51385 1727204601.07766: checking to see if all hosts have failed and the running result is not ok
51385 1727204601.07767: done checking to see if all hosts have failed
51385 1727204601.07768: getting the remaining hosts for this loop
51385 1727204601.07770: done getting the remaining hosts for this loop
51385 1727204601.07773: getting the next task for host managed-node1
51385 1727204601.07781: done getting next task for host managed-node1
51385 1727204601.07784: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }}
51385 1727204601.07788: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
51385 1727204601.07794: getting variables
51385 1727204601.07796: in VariableManager get_vars()
51385 1727204601.07843: Calling all_inventory to load vars for managed-node1
51385 1727204601.07847: Calling groups_inventory to load vars for managed-node1
51385 1727204601.07850: Calling all_plugins_inventory to load vars for managed-node1
51385 1727204601.07868: Calling all_plugins_play to load vars for managed-node1
51385 1727204601.07872: Calling groups_plugins_inventory to load vars for managed-node1
51385 1727204601.07875: Calling groups_plugins_play to load vars for managed-node1
51385 1727204601.08994: done sending task result for task 0affcd87-79f5-6b1f-5706-0000000006ba
51385 1727204601.08998: WORKER PROCESS EXITING
51385 1727204601.09990: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
51385 1727204601.11772: done with get_vars()
51385 1727204601.11806: done getting variables
51385 1727204601.11879: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
51385 1727204601.11997: variable 'profile' from source: include params
51385 1727204601.12001: variable 'item' from source: include params
51385 1727204601.12066: variable 'item' from source: include params

TASK [Assert that the fingerprint comment is present in lsr101] ****************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15
Tuesday 24 September 2024 15:03:21 -0400 (0:00:00.072) 0:00:19.525 *****
51385 1727204601.12117: entering _queue_task() for managed-node1/assert
51385 1727204601.12456: worker is 1 (out of 1 available)
51385 1727204601.12474: exiting _queue_task() for managed-node1/assert
51385 1727204601.12488: done queuing things up, now waiting for results queue to drain
51385 1727204601.12489: waiting for pending results...
51385 1727204601.12785: running TaskExecutor() for managed-node1/TASK: Assert that the fingerprint comment is present in lsr101
51385 1727204601.12898: in run() - task 0affcd87-79f5-6b1f-5706-0000000006bb
51385 1727204601.12918: variable 'ansible_search_path' from source: unknown
51385 1727204601.12941: variable 'ansible_search_path' from source: unknown
51385 1727204601.12990: calling self._execute()
51385 1727204601.13095: variable 'ansible_host' from source: host vars for 'managed-node1'
51385 1727204601.13107: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
51385 1727204601.13121: variable 'omit' from source: magic vars
51385 1727204601.13529: variable 'ansible_distribution_major_version' from source: facts
51385 1727204601.13547: Evaluated conditional (ansible_distribution_major_version != '6'): True
51385 1727204601.13558: variable 'omit' from source: magic vars
51385 1727204601.13613: variable 'omit' from source: magic vars
51385 1727204601.13724: variable 'profile' from source: include params
51385 1727204601.13732: variable 'item' from source: include params
51385 1727204601.13801: variable 'item' from source: include params
51385 1727204601.13827: variable 'omit' from source: magic vars
51385 1727204601.13879: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
51385 1727204601.13921: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
51385 1727204601.13951: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
51385 1727204601.13979: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
51385 1727204601.13996: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
51385 1727204601.14049: variable 'inventory_hostname' from source: host vars for 'managed-node1'
51385 1727204601.14057: variable 'ansible_host' from source: host vars for 'managed-node1'
51385 1727204601.14071: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
51385 1727204601.14181: Set connection var ansible_pipelining to False
51385 1727204601.14189: Set connection var ansible_shell_type to sh
51385 1727204601.14211: Set connection var ansible_module_compression to ZIP_DEFLATED
51385 1727204601.14230: Set connection var ansible_timeout to 10
51385 1727204601.14241: Set connection var ansible_connection to ssh
51385 1727204601.14252: Set connection var ansible_shell_executable to /bin/sh
51385 1727204601.14285: variable 'ansible_shell_executable' from source: unknown
51385 1727204601.14293: variable 'ansible_connection' from source: unknown
51385 1727204601.14301: variable 'ansible_module_compression' from source: unknown
51385 1727204601.14307: variable 'ansible_shell_type' from source: unknown
51385 1727204601.14313: variable 'ansible_shell_executable' from source: unknown
51385 1727204601.14319: variable 'ansible_host' from source: host vars for 'managed-node1'
51385 1727204601.14326: variable 'ansible_pipelining' from source: unknown
51385 1727204601.14332: variable 'ansible_timeout' from source: unknown
51385 1727204601.14339: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
51385 1727204601.14508: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
51385 1727204601.14526: variable 'omit' from source: magic vars
51385 1727204601.14537: starting attempt loop
51385 1727204601.14542: running the handler
51385 1727204601.14668: variable 'lsr_net_profile_fingerprint' from source: set_fact
51385 1727204601.14680: Evaluated conditional (lsr_net_profile_fingerprint): True
51385 1727204601.14691: handler run complete
51385 1727204601.14709: attempt loop complete, returning result
51385 1727204601.14716: _execute() done
51385 1727204601.14722: dumping result to json
51385 1727204601.14729: done dumping result, returning
51385 1727204601.14739: done running TaskExecutor() for managed-node1/TASK: Assert that the fingerprint comment is present in lsr101 [0affcd87-79f5-6b1f-5706-0000000006bb]
51385 1727204601.14751: sending task result for task 0affcd87-79f5-6b1f-5706-0000000006bb
51385 1727204601.14865: done sending task result for task 0affcd87-79f5-6b1f-5706-0000000006bb
51385 1727204601.14875: WORKER PROCESS EXITING
ok: [managed-node1] => {
    "changed": false
}

MSG:

All assertions passed
51385 1727204601.14932: no more pending results, returning what we have
51385 1727204601.14936: results queue empty
51385 1727204601.14937: checking for any_errors_fatal
51385 1727204601.14944: done checking for any_errors_fatal
51385 1727204601.14945: checking for max_fail_percentage
51385 1727204601.14947: done checking for max_fail_percentage
51385 1727204601.14948: checking to see if all hosts have failed and the running result is not ok
51385 1727204601.14949: done checking to see if all hosts have failed
51385 1727204601.14950: getting the remaining hosts for this loop
51385 1727204601.14952: done getting the remaining hosts for this loop
51385 1727204601.14956: getting the next task for host managed-node1
51385 1727204601.14970: done getting next task for host managed-node1
51385 1727204601.14974: ^ task is: TASK: Include the task 'get_profile_stat.yml'
51385 1727204601.14978: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
51385 1727204601.14982: getting variables
51385 1727204601.14984: in VariableManager get_vars()
51385 1727204601.15031: Calling all_inventory to load vars for managed-node1
51385 1727204601.15034: Calling groups_inventory to load vars for managed-node1
51385 1727204601.15037: Calling all_plugins_inventory to load vars for managed-node1
51385 1727204601.15049: Calling all_plugins_play to load vars for managed-node1
51385 1727204601.15053: Calling groups_plugins_inventory to load vars for managed-node1
51385 1727204601.15056: Calling groups_plugins_play to load vars for managed-node1
51385 1727204601.17737: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
51385 1727204601.20824: done with get_vars()
51385 1727204601.20860: done getting variables

TASK [Include the task 'get_profile_stat.yml'] *********************************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3
Tuesday 24 September 2024 15:03:21 -0400 (0:00:00.088) 0:00:19.613 *****
51385 1727204601.20963: entering _queue_task() for managed-node1/include_tasks
51385 1727204601.21292: worker is 1 (out of 1 available)
51385 1727204601.21305: exiting _queue_task() for managed-node1/include_tasks
51385 1727204601.21319: done queuing things up, now waiting for results queue to drain
51385 1727204601.21320: waiting for pending results...
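For orientation while reading the trace: the three assertions above all come from assert_profile_present.yml (the log cites lines 3, 10, and 15 of that file), but the file itself is not shown. A plausible reconstruction, built only from the task names, line references, and conditionals (`lsr_net_profile_exists`, `lsr_net_profile_ansible_managed`, `lsr_net_profile_fingerprint`) that appear in this log, might look like the following; the exact source may differ, and the `ansible_distribution_major_version != '6'` gate seen on every task here is presumably inherited from the surrounding test play:

```yaml
# Hypothetical reconstruction of assert_profile_present.yml -- only task
# names, file line numbers, and the conditionals evaluated in the trace
# are taken from the log; everything else is an assumption.
- name: Include the task 'get_profile_stat.yml'        # yml:3 per the log
  include_tasks: get_profile_stat.yml

- name: "Assert that the profile is present - '{{ profile }}'"
  assert:
    that:
      - lsr_net_profile_exists

- name: "Assert that the ansible managed comment is present in '{{ profile }}'"  # yml:10
  assert:
    that:
      - lsr_net_profile_ansible_managed

- name: "Assert that the fingerprint comment is present in {{ profile }}"        # yml:15
  assert:
    that:
      - lsr_net_profile_fingerprint
```

This matches the execution order in the trace: the include runs first and sets the three `lsr_net_profile_*` facts, after which each `assert` task's conditional evaluates to `True` and reports "All assertions passed".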
51385 1727204601.21602: running TaskExecutor() for managed-node1/TASK: Include the task 'get_profile_stat.yml'
51385 1727204601.21720: in run() - task 0affcd87-79f5-6b1f-5706-0000000006bf
51385 1727204601.21739: variable 'ansible_search_path' from source: unknown
51385 1727204601.21747: variable 'ansible_search_path' from source: unknown
51385 1727204601.21794: calling self._execute()
51385 1727204601.21896: variable 'ansible_host' from source: host vars for 'managed-node1'
51385 1727204601.21907: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
51385 1727204601.21919: variable 'omit' from source: magic vars
51385 1727204601.22387: variable 'ansible_distribution_major_version' from source: facts
51385 1727204601.22404: Evaluated conditional (ansible_distribution_major_version != '6'): True
51385 1727204601.22413: _execute() done
51385 1727204601.22421: dumping result to json
51385 1727204601.22431: done dumping result, returning
51385 1727204601.22441: done running TaskExecutor() for managed-node1/TASK: Include the task 'get_profile_stat.yml' [0affcd87-79f5-6b1f-5706-0000000006bf]
51385 1727204601.22452: sending task result for task 0affcd87-79f5-6b1f-5706-0000000006bf
51385 1727204601.22581: no more pending results, returning what we have
51385 1727204601.22587: in VariableManager get_vars()
51385 1727204601.22638: Calling all_inventory to load vars for managed-node1
51385 1727204601.22641: Calling groups_inventory to load vars for managed-node1
51385 1727204601.22644: Calling all_plugins_inventory to load vars for managed-node1
51385 1727204601.22659: Calling all_plugins_play to load vars for managed-node1
51385 1727204601.22662: Calling groups_plugins_inventory to load vars for managed-node1
51385 1727204601.22667: Calling groups_plugins_play to load vars for managed-node1
51385 1727204601.24155: done sending task result for task 0affcd87-79f5-6b1f-5706-0000000006bf
51385 1727204601.24160: WORKER PROCESS EXITING
51385 1727204601.32481: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
51385 1727204601.35938: done with get_vars()
51385 1727204601.35977: variable 'ansible_search_path' from source: unknown
51385 1727204601.35979: variable 'ansible_search_path' from source: unknown
51385 1727204601.36049: we have included files to process
51385 1727204601.36050: generating all_blocks data
51385 1727204601.36052: done generating all_blocks data
51385 1727204601.36055: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml
51385 1727204601.36056: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml
51385 1727204601.36058: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml
51385 1727204601.37098: done processing included file
51385 1727204601.37101: iterating over new_blocks loaded from include file
51385 1727204601.37102: in VariableManager get_vars()
51385 1727204601.37125: done with get_vars()
51385 1727204601.37127: filtering new block on tags
51385 1727204601.37152: done filtering new block on tags
51385 1727204601.37155: in VariableManager get_vars()
51385 1727204601.37179: done with get_vars()
51385 1727204601.37181: filtering new block on tags
51385 1727204601.37203: done filtering new block on tags
51385 1727204601.37205: done iterating over new_blocks loaded from include file
included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed-node1
51385 1727204601.37210: extending task lists for all hosts with included blocks
51385 1727204601.37390: done extending task lists
51385 1727204601.37391: done processing included files
51385 1727204601.37392: results queue empty
51385 1727204601.37393: checking for any_errors_fatal
51385 1727204601.37396: done checking for any_errors_fatal
51385 1727204601.37396: checking for max_fail_percentage
51385 1727204601.37398: done checking for max_fail_percentage
51385 1727204601.37398: checking to see if all hosts have failed and the running result is not ok
51385 1727204601.37399: done checking to see if all hosts have failed
51385 1727204601.37400: getting the remaining hosts for this loop
51385 1727204601.37401: done getting the remaining hosts for this loop
51385 1727204601.37403: getting the next task for host managed-node1
51385 1727204601.37407: done getting next task for host managed-node1
51385 1727204601.37409: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag
51385 1727204601.37412: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
51385 1727204601.37414: getting variables
51385 1727204601.37415: in VariableManager get_vars()
51385 1727204601.37429: Calling all_inventory to load vars for managed-node1
51385 1727204601.37432: Calling groups_inventory to load vars for managed-node1
51385 1727204601.37434: Calling all_plugins_inventory to load vars for managed-node1
51385 1727204601.37439: Calling all_plugins_play to load vars for managed-node1
51385 1727204601.37441: Calling groups_plugins_inventory to load vars for managed-node1
51385 1727204601.37444: Calling groups_plugins_play to load vars for managed-node1
51385 1727204601.38968: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
51385 1727204601.40758: done with get_vars()
51385 1727204601.40793: done getting variables
51385 1727204601.40841: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Initialize NM profile exist and ansible_managed comment flag] ************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3
Tuesday 24 September 2024 15:03:21 -0400 (0:00:00.199) 0:00:19.812 *****
51385 1727204601.40874: entering _queue_task() for managed-node1/set_fact
51385 1727204601.41212: worker is 1 (out of 1 available)
51385 1727204601.41226: exiting _queue_task() for managed-node1/set_fact
51385 1727204601.41238: done queuing things up, now waiting for results queue to drain
51385 1727204601.41239: waiting for pending results...
51385 1727204601.41530: running TaskExecutor() for managed-node1/TASK: Initialize NM profile exist and ansible_managed comment flag
51385 1727204601.41662: in run() - task 0affcd87-79f5-6b1f-5706-000000000838
51385 1727204601.41687: variable 'ansible_search_path' from source: unknown
51385 1727204601.41694: variable 'ansible_search_path' from source: unknown
51385 1727204601.41734: calling self._execute()
51385 1727204601.41835: variable 'ansible_host' from source: host vars for 'managed-node1'
51385 1727204601.41848: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
51385 1727204601.41862: variable 'omit' from source: magic vars
51385 1727204601.42361: variable 'ansible_distribution_major_version' from source: facts
51385 1727204601.42383: Evaluated conditional (ansible_distribution_major_version != '6'): True
51385 1727204601.42394: variable 'omit' from source: magic vars
51385 1727204601.42480: variable 'omit' from source: magic vars
51385 1727204601.42524: variable 'omit' from source: magic vars
51385 1727204601.42623: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
51385 1727204601.42680: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
51385 1727204601.42707: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
51385 1727204601.42740: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
51385 1727204601.42755: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
51385 1727204601.42795: variable 'inventory_hostname' from source: host vars for 'managed-node1'
51385 1727204601.42803: variable 'ansible_host' from source: host vars for 'managed-node1'
51385 1727204601.42811: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
51385 1727204601.42915: Set connection var ansible_pipelining to False
51385 1727204601.42924: Set connection var ansible_shell_type to sh
51385 1727204601.42937: Set connection var ansible_module_compression to ZIP_DEFLATED
51385 1727204601.42948: Set connection var ansible_timeout to 10
51385 1727204601.42955: Set connection var ansible_connection to ssh
51385 1727204601.42967: Set connection var ansible_shell_executable to /bin/sh
51385 1727204601.43001: variable 'ansible_shell_executable' from source: unknown
51385 1727204601.43009: variable 'ansible_connection' from source: unknown
51385 1727204601.43017: variable 'ansible_module_compression' from source: unknown
51385 1727204601.43023: variable 'ansible_shell_type' from source: unknown
51385 1727204601.43030: variable 'ansible_shell_executable' from source: unknown
51385 1727204601.43036: variable 'ansible_host' from source: host vars for 'managed-node1'
51385 1727204601.43044: variable 'ansible_pipelining' from source: unknown
51385 1727204601.43050: variable 'ansible_timeout' from source: unknown
51385 1727204601.43057: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
51385 1727204601.43211: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
51385 1727204601.43230: variable 'omit' from source: magic vars
51385 1727204601.43243: starting attempt loop
51385 1727204601.43249: running the handler
51385 1727204601.43266: handler run complete
51385 1727204601.43282: attempt loop complete, returning result
51385 1727204601.43289: _execute() done
51385 1727204601.43295: dumping result to json
51385 1727204601.43306: done dumping result, returning
51385 1727204601.43317: done running TaskExecutor() for managed-node1/TASK: Initialize NM profile exist and ansible_managed comment flag [0affcd87-79f5-6b1f-5706-000000000838]
51385 1727204601.43326: sending task result for task 0affcd87-79f5-6b1f-5706-000000000838
ok: [managed-node1] => {
    "ansible_facts": {
        "lsr_net_profile_ansible_managed": false,
        "lsr_net_profile_exists": false,
        "lsr_net_profile_fingerprint": false
    },
    "changed": false
}
51385 1727204601.43478: no more pending results, returning what we have
51385 1727204601.43482: results queue empty
51385 1727204601.43483: checking for any_errors_fatal
51385 1727204601.43485: done checking for any_errors_fatal
51385 1727204601.43486: checking for max_fail_percentage
51385 1727204601.43488: done checking for max_fail_percentage
51385 1727204601.43489: checking to see if all hosts have failed and the running result is not ok
51385 1727204601.43490: done checking to see if all hosts have failed
51385 1727204601.43490: getting the remaining hosts for this loop
51385 1727204601.43492: done getting the remaining hosts for this loop
51385 1727204601.43496: getting the next task for host managed-node1
51385 1727204601.43503: done getting next task for host managed-node1
51385 1727204601.43506: ^ task is: TASK: Stat profile file
51385 1727204601.43511: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
51385 1727204601.43516: getting variables
51385 1727204601.43518: in VariableManager get_vars()
51385 1727204601.43566: Calling all_inventory to load vars for managed-node1
51385 1727204601.43569: Calling groups_inventory to load vars for managed-node1
51385 1727204601.43572: Calling all_plugins_inventory to load vars for managed-node1
51385 1727204601.43583: Calling all_plugins_play to load vars for managed-node1
51385 1727204601.43586: Calling groups_plugins_inventory to load vars for managed-node1
51385 1727204601.43589: Calling groups_plugins_play to load vars for managed-node1
51385 1727204601.44803: done sending task result for task 0affcd87-79f5-6b1f-5706-000000000838
51385 1727204601.44807: WORKER PROCESS EXITING
51385 1727204601.45612: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
51385 1727204601.46566: done with get_vars()
51385 1727204601.46589: done getting variables

TASK [Stat profile file] *******************************************************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9
Tuesday 24 September 2024 15:03:21 -0400 (0:00:00.057) 0:00:19.870 *****
51385 1727204601.46659: entering _queue_task() for managed-node1/stat
51385 1727204601.46907: worker is 1 (out of 1 available)
51385 1727204601.46921: exiting _queue_task() for managed-node1/stat
51385 1727204601.46933: done queuing things up, now waiting for results queue to drain
51385 1727204601.46934: waiting for pending results...
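The set_fact result above shows get_profile_stat.yml resetting its three flags before re-checking the profile. Based on the fact names and values printed in the log (and the task names at lines 3 and 9 of that file), the opening of get_profile_stat.yml plausibly reads as follows; the stat `path` and `register` name below are assumptions for illustration, not taken from the log:

```yaml
# Hypothetical reconstruction of the start of get_profile_stat.yml.
# The fact names and initial values are exactly those in the set_fact
# result above; the stat path and register name are assumptions.
- name: Initialize NM profile exist and ansible_managed comment flag   # yml:3
  set_fact:
    lsr_net_profile_exists: false
    lsr_net_profile_ansible_managed: false
    lsr_net_profile_fingerprint: false

- name: Stat profile file                                              # yml:9
  stat:
    path: /etc/sysconfig/network-scripts/ifcfg-{{ profile }}           # assumed location
  register: profile_stat
```

Resetting the flags first means a stale `true` from an earlier profile check cannot leak into this iteration; the later assert tasks in assert_profile_present.yml then only pass if the checks for the current profile actually set the flags back to `true`.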
51385 1727204601.47145: running TaskExecutor() for managed-node1/TASK: Stat profile file
51385 1727204601.47292: in run() - task 0affcd87-79f5-6b1f-5706-000000000839
51385 1727204601.47297: variable 'ansible_search_path' from source: unknown
51385 1727204601.47300: variable 'ansible_search_path' from source: unknown
51385 1727204601.47354: calling self._execute()
51385 1727204601.47476: variable 'ansible_host' from source: host vars for 'managed-node1'
51385 1727204601.47479: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
51385 1727204601.47483: variable 'omit' from source: magic vars
51385 1727204601.47836: variable 'ansible_distribution_major_version' from source: facts
51385 1727204601.47849: Evaluated conditional (ansible_distribution_major_version != '6'): True
51385 1727204601.47856: variable 'omit' from source: magic vars
51385 1727204601.47912: variable 'omit' from source: magic vars
51385 1727204601.48036: variable 'profile' from source: include params
51385 1727204601.48041: variable 'item' from source: include params
51385 1727204601.48114: variable 'item' from source: include params
51385 1727204601.48133: variable 'omit' from source: magic vars
51385 1727204601.48179: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
51385 1727204601.48240: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
51385 1727204601.48267: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
51385 1727204601.48291: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
51385 1727204601.48307: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
51385 1727204601.48361: variable 'inventory_hostname' from source: host vars for 'managed-node1'
51385 1727204601.48377: variable 'ansible_host' from source: host vars for 'managed-node1'
51385 1727204601.48380: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
51385 1727204601.48521: Set connection var ansible_pipelining to False
51385 1727204601.48535: Set connection var ansible_shell_type to sh
51385 1727204601.48549: Set connection var ansible_module_compression to ZIP_DEFLATED
51385 1727204601.48556: Set connection var ansible_timeout to 10
51385 1727204601.48561: Set connection var ansible_connection to ssh
51385 1727204601.48569: Set connection var ansible_shell_executable to /bin/sh
51385 1727204601.48600: variable 'ansible_shell_executable' from source: unknown
51385 1727204601.48604: variable 'ansible_connection' from source: unknown
51385 1727204601.48606: variable 'ansible_module_compression' from source: unknown
51385 1727204601.48608: variable 'ansible_shell_type' from source: unknown
51385 1727204601.48611: variable 'ansible_shell_executable' from source: unknown
51385 1727204601.48613: variable 'ansible_host' from source: host vars for 'managed-node1'
51385 1727204601.48626: variable 'ansible_pipelining' from source: unknown
51385 1727204601.48630: variable 'ansible_timeout' from source: unknown
51385 1727204601.48632: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
51385 1727204601.48807: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__)
51385 1727204601.48815: variable 'omit' from source: magic vars
51385 1727204601.48821: starting attempt loop
51385 1727204601.48824: running the handler
51385 1727204601.48836: _low_level_execute_command(): starting
51385 1727204601.48843: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
51385 1727204601.49360: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config <<<
51385 1727204601.49376: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
51385 1727204601.49400: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
51385 1727204601.49415: stderr chunk (state=3): >>>debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.9.148 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
51385 1727204601.49466: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
51385 1727204601.49480: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
51385 1727204601.49491: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
51385 1727204601.49559: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
51385 1727204601.51214: stdout chunk (state=3): >>>/root <<<
51385 1727204601.51329: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
51385 1727204601.51407: stderr chunk (state=3): >>><<<
51385 1727204601.51415: stdout chunk (state=3): >>><<<
51385 1727204601.51450: _low_level_execute_command() done: rc=0, stdout=/root
, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204601.51470: _low_level_execute_command(): starting 51385 1727204601.51474: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204601.5145051-52848-169809064088129 `" && echo ansible-tmp-1727204601.5145051-52848-169809064088129="` echo /root/.ansible/tmp/ansible-tmp-1727204601.5145051-52848-169809064088129 `" ) && sleep 0' 51385 1727204601.52240: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204601.52262: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204601.52268: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204601.52306: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204601.52318: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204601.52336: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204601.52370: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204601.52390: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204601.52393: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204601.52457: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204601.54326: stdout chunk (state=3): >>>ansible-tmp-1727204601.5145051-52848-169809064088129=/root/.ansible/tmp/ansible-tmp-1727204601.5145051-52848-169809064088129 <<< 51385 1727204601.54445: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204601.54550: stderr chunk (state=3): >>><<< 51385 1727204601.54564: stdout chunk (state=3): >>><<< 51385 1727204601.54841: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204601.5145051-52848-169809064088129=/root/.ansible/tmp/ansible-tmp-1727204601.5145051-52848-169809064088129 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204601.54849: variable 'ansible_module_compression' from source: unknown 51385 1727204601.54851: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-51385tpxlmlox/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 51385 1727204601.54853: variable 'ansible_facts' from source: unknown 51385 1727204601.54892: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204601.5145051-52848-169809064088129/AnsiballZ_stat.py 51385 1727204601.55087: Sending initial data 51385 1727204601.55090: Sent initial data (153 bytes) 51385 1727204601.56181: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204601.56213: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204601.56244: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204601.56500: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204601.56503: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 51385 1727204601.56506: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204601.56604: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204601.56629: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204601.56730: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204601.58435: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 51385 1727204601.58554: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 51385 1727204601.58604: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-51385tpxlmlox/tmpt0db8vb4 /root/.ansible/tmp/ansible-tmp-1727204601.5145051-52848-169809064088129/AnsiballZ_stat.py <<< 51385 1727204601.58671: 
stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 51385 1727204601.59995: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204601.60112: stderr chunk (state=3): >>><<< 51385 1727204601.60115: stdout chunk (state=3): >>><<< 51385 1727204601.60118: done transferring module to remote 51385 1727204601.60120: _low_level_execute_command(): starting 51385 1727204601.60127: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204601.5145051-52848-169809064088129/ /root/.ansible/tmp/ansible-tmp-1727204601.5145051-52848-169809064088129/AnsiballZ_stat.py && sleep 0' 51385 1727204601.60725: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204601.60734: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204601.60745: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204601.60758: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204601.60802: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204601.60809: stderr chunk (state=3): >>>debug2: match not found <<< 51385 1727204601.60819: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204601.60832: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 51385 1727204601.60840: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 51385 1727204601.60848: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 51385 1727204601.60857: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204601.60871: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 
51385 1727204601.60879: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204601.60887: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204601.60897: stderr chunk (state=3): >>>debug2: match found <<< 51385 1727204601.60900: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204601.60973: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204601.60992: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204601.61000: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204601.61083: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204601.62785: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204601.62836: stderr chunk (state=3): >>><<< 51385 1727204601.62840: stdout chunk (state=3): >>><<< 51385 1727204601.62861: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204601.62866: _low_level_execute_command(): starting 51385 1727204601.62869: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204601.5145051-52848-169809064088129/AnsiballZ_stat.py && sleep 0' 51385 1727204601.63484: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204601.63488: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204601.63502: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found <<< 51385 1727204601.63508: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204601.63582: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204601.63589: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204601.63595: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204601.63671: 
stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204601.76694: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-lsr101.90", "follow": false, "checksum_algorithm": "sha1"}}} <<< 51385 1727204601.77599: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. <<< 51385 1727204601.77660: stderr chunk (state=3): >>><<< 51385 1727204601.77666: stdout chunk (state=3): >>><<< 51385 1727204601.77687: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-lsr101.90", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 51385 1727204601.77710: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-lsr101.90', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204601.5145051-52848-169809064088129/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 51385 1727204601.77722: _low_level_execute_command(): starting 51385 1727204601.77725: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204601.5145051-52848-169809064088129/ > /dev/null 2>&1 && sleep 0' 51385 1727204601.78205: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204601.78209: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204601.78251: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204601.78255: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204601.78273: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204601.78279: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204601.78333: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204601.78346: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204601.78409: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204601.80138: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204601.80198: stderr chunk (state=3): >>><<< 51385 1727204601.80201: stdout chunk (state=3): >>><<< 51385 1727204601.80216: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204601.80222: handler run complete 51385 1727204601.80238: attempt loop complete, returning result 51385 1727204601.80241: _execute() done 51385 1727204601.80243: dumping result to json 51385 1727204601.80245: done dumping result, returning 51385 1727204601.80255: done running TaskExecutor() for managed-node1/TASK: Stat profile file [0affcd87-79f5-6b1f-5706-000000000839] 51385 1727204601.80265: sending task result for task 0affcd87-79f5-6b1f-5706-000000000839 51385 1727204601.80358: done sending task result for task 0affcd87-79f5-6b1f-5706-000000000839 51385 1727204601.80363: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "stat": { "exists": false } } 51385 1727204601.80421: no more pending results, returning what we have 51385 1727204601.80425: results queue empty 51385 1727204601.80426: checking for any_errors_fatal 51385 1727204601.80432: done checking for any_errors_fatal 51385 1727204601.80433: checking for max_fail_percentage 51385 1727204601.80435: done checking for max_fail_percentage 51385 1727204601.80435: checking to see if all hosts have failed and the running result is not ok 51385 1727204601.80436: done checking to see if all hosts have failed 51385 1727204601.80438: getting the remaining hosts for this loop 51385 1727204601.80439: done getting the remaining hosts for this loop 51385 1727204601.80443: getting the next task for host managed-node1 51385 1727204601.80450: done getting next task for host managed-node1 51385 1727204601.80452: ^ task is: TASK: Set NM profile exist flag based on the profile files 51385 1727204601.80456: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 51385 1727204601.80462: getting variables 51385 1727204601.80471: in VariableManager get_vars() 51385 1727204601.80515: Calling all_inventory to load vars for managed-node1 51385 1727204601.80518: Calling groups_inventory to load vars for managed-node1 51385 1727204601.80520: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204601.80531: Calling all_plugins_play to load vars for managed-node1 51385 1727204601.80533: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204601.80536: Calling groups_plugins_play to load vars for managed-node1 51385 1727204601.81394: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204601.82445: done with get_vars() 51385 1727204601.82466: done getting variables 51385 1727204601.82511: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: 
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Tuesday 24 September 2024 15:03:21 -0400 (0:00:00.358) 0:00:20.229 ***** 51385 1727204601.82534: entering _queue_task() for managed-node1/set_fact 51385 1727204601.82767: worker is 1 (out of 1 available) 51385 1727204601.82783: exiting _queue_task() for managed-node1/set_fact 51385 1727204601.82795: done queuing things up, now waiting for results queue to drain 51385 1727204601.82797: waiting for pending results... 51385 1727204601.82987: running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag based on the profile files 51385 1727204601.83065: in run() - task 0affcd87-79f5-6b1f-5706-00000000083a 51385 1727204601.83078: variable 'ansible_search_path' from source: unknown 51385 1727204601.83081: variable 'ansible_search_path' from source: unknown 51385 1727204601.83111: calling self._execute() 51385 1727204601.83188: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204601.83191: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204601.83201: variable 'omit' from source: magic vars 51385 1727204601.83487: variable 'ansible_distribution_major_version' from source: facts 51385 1727204601.83498: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204601.83591: variable 'profile_stat' from source: set_fact 51385 1727204601.83601: Evaluated conditional (profile_stat.stat.exists): False 51385 1727204601.83604: when evaluation is False, skipping this task 51385 1727204601.83607: _execute() done 51385 1727204601.83609: dumping result to json 51385 1727204601.83613: done dumping result, returning 51385 1727204601.83619: done running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag based on the profile files [0affcd87-79f5-6b1f-5706-00000000083a] 51385 1727204601.83625: sending task result for task 
0affcd87-79f5-6b1f-5706-00000000083a 51385 1727204601.83717: done sending task result for task 0affcd87-79f5-6b1f-5706-00000000083a 51385 1727204601.83719: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 51385 1727204601.83800: no more pending results, returning what we have 51385 1727204601.83804: results queue empty 51385 1727204601.83805: checking for any_errors_fatal 51385 1727204601.83814: done checking for any_errors_fatal 51385 1727204601.83815: checking for max_fail_percentage 51385 1727204601.83817: done checking for max_fail_percentage 51385 1727204601.83818: checking to see if all hosts have failed and the running result is not ok 51385 1727204601.83818: done checking to see if all hosts have failed 51385 1727204601.83819: getting the remaining hosts for this loop 51385 1727204601.83821: done getting the remaining hosts for this loop 51385 1727204601.83824: getting the next task for host managed-node1 51385 1727204601.83831: done getting next task for host managed-node1 51385 1727204601.83833: ^ task is: TASK: Get NM profile info 51385 1727204601.83838: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 51385 1727204601.83842: getting variables 51385 1727204601.83843: in VariableManager get_vars() 51385 1727204601.83891: Calling all_inventory to load vars for managed-node1 51385 1727204601.83895: Calling groups_inventory to load vars for managed-node1 51385 1727204601.83897: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204601.83907: Calling all_plugins_play to load vars for managed-node1 51385 1727204601.83910: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204601.83912: Calling groups_plugins_play to load vars for managed-node1 51385 1727204601.84740: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204601.86000: done with get_vars() 51385 1727204601.86024: done getting variables 51385 1727204601.86088: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Tuesday 24 September 2024 15:03:21 -0400 (0:00:00.035) 0:00:20.265 ***** 51385 1727204601.86125: entering _queue_task() for managed-node1/shell 51385 1727204601.86603: worker is 1 (out of 1 available) 51385 1727204601.86618: exiting _queue_task() for managed-node1/shell 51385 1727204601.86631: done queuing things up, now waiting for results queue to drain 51385 1727204601.86632: waiting for pending results... 
51385 1727204601.86832: running TaskExecutor() for managed-node1/TASK: Get NM profile info 51385 1727204601.86915: in run() - task 0affcd87-79f5-6b1f-5706-00000000083b 51385 1727204601.86925: variable 'ansible_search_path' from source: unknown 51385 1727204601.86928: variable 'ansible_search_path' from source: unknown 51385 1727204601.86956: calling self._execute() 51385 1727204601.87035: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204601.87039: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204601.87048: variable 'omit' from source: magic vars 51385 1727204601.87338: variable 'ansible_distribution_major_version' from source: facts 51385 1727204601.87348: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204601.87354: variable 'omit' from source: magic vars 51385 1727204601.87390: variable 'omit' from source: magic vars 51385 1727204601.87462: variable 'profile' from source: include params 51385 1727204601.87470: variable 'item' from source: include params 51385 1727204601.87521: variable 'item' from source: include params 51385 1727204601.87533: variable 'omit' from source: magic vars 51385 1727204601.87571: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 51385 1727204601.87598: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 51385 1727204601.87615: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 51385 1727204601.87631: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 51385 1727204601.87642: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 51385 1727204601.87668: variable 'inventory_hostname' from source: host vars for 'managed-node1' 51385 
1727204601.87671: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204601.87674: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204601.87745: Set connection var ansible_pipelining to False 51385 1727204601.87748: Set connection var ansible_shell_type to sh 51385 1727204601.87756: Set connection var ansible_module_compression to ZIP_DEFLATED 51385 1727204601.87766: Set connection var ansible_timeout to 10 51385 1727204601.87769: Set connection var ansible_connection to ssh 51385 1727204601.87774: Set connection var ansible_shell_executable to /bin/sh 51385 1727204601.87791: variable 'ansible_shell_executable' from source: unknown 51385 1727204601.87794: variable 'ansible_connection' from source: unknown 51385 1727204601.87796: variable 'ansible_module_compression' from source: unknown 51385 1727204601.87798: variable 'ansible_shell_type' from source: unknown 51385 1727204601.87801: variable 'ansible_shell_executable' from source: unknown 51385 1727204601.87803: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204601.87807: variable 'ansible_pipelining' from source: unknown 51385 1727204601.87810: variable 'ansible_timeout' from source: unknown 51385 1727204601.87814: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204601.87915: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 51385 1727204601.87923: variable 'omit' from source: magic vars 51385 1727204601.87928: starting attempt loop 51385 1727204601.87930: running the handler 51385 1727204601.87940: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 51385 1727204601.87959: _low_level_execute_command(): starting 51385 1727204601.87970: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 51385 1727204601.88578: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204601.88594: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204601.88615: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204601.88637: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204601.88682: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204601.88692: stderr chunk (state=3): >>>debug2: match not found <<< 51385 1727204601.88704: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204601.88724: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 51385 1727204601.88738: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 51385 1727204601.88746: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 51385 1727204601.88756: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204601.88771: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204601.88788: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204601.88800: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204601.88811: stderr chunk (state=3): >>>debug2: match found <<< 51385 
1727204601.88824: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204601.89236: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204601.89240: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204601.89245: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204601.89308: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204601.90847: stdout chunk (state=3): >>>/root <<< 51385 1727204601.90950: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204601.91060: stderr chunk (state=3): >>><<< 51385 1727204601.91076: stdout chunk (state=3): >>><<< 51385 1727204601.91208: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
status from master 0 51385 1727204601.91220: _low_level_execute_command(): starting 51385 1727204601.91223: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204601.9111454-52890-238853252439026 `" && echo ansible-tmp-1727204601.9111454-52890-238853252439026="` echo /root/.ansible/tmp/ansible-tmp-1727204601.9111454-52890-238853252439026 `" ) && sleep 0' 51385 1727204601.91814: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204601.91834: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204601.91851: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204601.91880: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204601.91928: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204601.91941: stderr chunk (state=3): >>>debug2: match not found <<< 51385 1727204601.91956: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204601.91988: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 51385 1727204601.92004: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 51385 1727204601.92016: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 51385 1727204601.92029: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204601.92044: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204601.92059: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204601.92074: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.9.148 originally 10.31.9.148 <<< 51385 1727204601.92090: stderr chunk (state=3): >>>debug2: match found <<< 51385 1727204601.92108: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204601.92185: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204601.92215: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204601.92238: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204601.92331: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204601.94160: stdout chunk (state=3): >>>ansible-tmp-1727204601.9111454-52890-238853252439026=/root/.ansible/tmp/ansible-tmp-1727204601.9111454-52890-238853252439026 <<< 51385 1727204601.94358: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204601.94361: stdout chunk (state=3): >>><<< 51385 1727204601.94365: stderr chunk (state=3): >>><<< 51385 1727204601.94570: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204601.9111454-52890-238853252439026=/root/.ansible/tmp/ansible-tmp-1727204601.9111454-52890-238853252439026 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204601.94573: variable 'ansible_module_compression' from source: unknown 51385 1727204601.94576: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-51385tpxlmlox/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 51385 1727204601.94578: variable 'ansible_facts' from source: unknown 51385 1727204601.94634: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204601.9111454-52890-238853252439026/AnsiballZ_command.py 51385 1727204601.94797: Sending initial data 51385 1727204601.94800: Sent initial data (156 bytes) 51385 1727204601.96054: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204601.96078: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204601.96094: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204601.96113: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204601.96167: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204601.96181: stderr chunk (state=3): >>>debug2: match not found <<< 51385 1727204601.96196: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204601.96214: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 51385 1727204601.96226: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 51385 1727204601.96243: stderr chunk (state=3): 
>>>debug1: re-parsing configuration <<< 51385 1727204601.96256: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204601.96288: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204601.96306: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204601.96319: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204601.96330: stderr chunk (state=3): >>>debug2: match found <<< 51385 1727204601.96346: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204601.96423: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204601.96447: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204601.96469: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204601.96581: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204601.98278: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 51385 1727204601.98383: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 51385 1727204601.98387: stdout chunk (state=3): >>>sftp> put 
/root/.ansible/tmp/ansible-local-51385tpxlmlox/tmpe78zdiy4 /root/.ansible/tmp/ansible-tmp-1727204601.9111454-52890-238853252439026/AnsiballZ_command.py <<< 51385 1727204601.98427: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 51385 1727204601.99887: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204601.99974: stderr chunk (state=3): >>><<< 51385 1727204601.99978: stdout chunk (state=3): >>><<< 51385 1727204602.00001: done transferring module to remote 51385 1727204602.00013: _low_level_execute_command(): starting 51385 1727204602.00018: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204601.9111454-52890-238853252439026/ /root/.ansible/tmp/ansible-tmp-1727204601.9111454-52890-238853252439026/AnsiballZ_command.py && sleep 0' 51385 1727204602.00728: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204602.00983: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204602.00994: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204602.01008: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204602.01054: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204602.01061: stderr chunk (state=3): >>>debug2: match not found <<< 51385 1727204602.01082: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204602.01096: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 51385 1727204602.01103: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 51385 1727204602.01111: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 51385 1727204602.01119: stderr 
chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204602.01127: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204602.01138: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204602.01146: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204602.01152: stderr chunk (state=3): >>>debug2: match found <<< 51385 1727204602.01161: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204602.01237: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204602.01255: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204602.01272: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204602.01354: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204602.03144: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204602.03150: stdout chunk (state=3): >>><<< 51385 1727204602.03152: stderr chunk (state=3): >>><<< 51385 1727204602.03216: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204602.03220: _low_level_execute_command(): starting 51385 1727204602.03223: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204601.9111454-52890-238853252439026/AnsiballZ_command.py && sleep 0' 51385 1727204602.05073: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204602.05083: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204602.05094: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204602.05109: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204602.05153: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204602.05157: stderr chunk (state=3): >>>debug2: match not found <<< 51385 1727204602.05173: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204602.05188: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 51385 1727204602.05195: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 51385 1727204602.05201: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 51385 1727204602.05209: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204602.05218: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204602.05230: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204602.05238: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204602.05245: stderr chunk (state=3): >>>debug2: match found <<< 51385 1727204602.05253: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204602.05331: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204602.05351: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204602.05367: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204602.05462: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204602.20717: stdout chunk (state=3): >>> {"changed": true, "stdout": "lsr101.90 /etc/NetworkManager/system-connections/lsr101.90.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep lsr101.90 | grep /etc", "start": "2024-09-24 15:03:22.185438", "end": "2024-09-24 15:03:22.206713", "delta": "0:00:00.021275", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep lsr101.90 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 51385 1727204602.21883: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
<<< 51385 1727204602.21940: stderr chunk (state=3): >>><<< 51385 1727204602.21944: stdout chunk (state=3): >>><<< 51385 1727204602.21959: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "lsr101.90 /etc/NetworkManager/system-connections/lsr101.90.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep lsr101.90 | grep /etc", "start": "2024-09-24 15:03:22.185438", "end": "2024-09-24 15:03:22.206713", "delta": "0:00:00.021275", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep lsr101.90 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 
closed. 51385 1727204602.21994: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep lsr101.90 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204601.9111454-52890-238853252439026/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 51385 1727204602.22001: _low_level_execute_command(): starting 51385 1727204602.22006: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204601.9111454-52890-238853252439026/ > /dev/null 2>&1 && sleep 0' 51385 1727204602.22807: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204602.22811: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204602.22814: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204602.22816: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204602.22819: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204602.22821: stderr chunk (state=3): >>>debug2: match not found <<< 51385 1727204602.22823: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204602.22826: stderr chunk (state=3): >>>debug1: configuration requests 
final Match pass <<< 51385 1727204602.22828: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 51385 1727204602.22830: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 51385 1727204602.22832: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204602.22843: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204602.23014: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204602.23018: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204602.23020: stderr chunk (state=3): >>>debug2: match found <<< 51385 1727204602.23022: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204602.23024: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204602.23026: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204602.23028: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204602.23080: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204602.24845: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204602.24901: stderr chunk (state=3): >>><<< 51385 1727204602.24905: stdout chunk (state=3): >>><<< 51385 1727204602.24921: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204602.24927: handler run complete 51385 1727204602.24945: Evaluated conditional (False): False 51385 1727204602.24953: attempt loop complete, returning result 51385 1727204602.24956: _execute() done 51385 1727204602.24958: dumping result to json 51385 1727204602.24967: done dumping result, returning 51385 1727204602.24975: done running TaskExecutor() for managed-node1/TASK: Get NM profile info [0affcd87-79f5-6b1f-5706-00000000083b] 51385 1727204602.24983: sending task result for task 0affcd87-79f5-6b1f-5706-00000000083b 51385 1727204602.25089: done sending task result for task 0affcd87-79f5-6b1f-5706-00000000083b 51385 1727204602.25092: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep lsr101.90 | grep /etc", "delta": "0:00:00.021275", "end": "2024-09-24 15:03:22.206713", "rc": 0, "start": "2024-09-24 15:03:22.185438" } STDOUT: lsr101.90 /etc/NetworkManager/system-connections/lsr101.90.nmconnection 51385 1727204602.25161: no more pending results, returning what we have 51385 1727204602.25166: results queue empty 51385 1727204602.25168: checking for any_errors_fatal 51385 1727204602.25174: done checking for any_errors_fatal 
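The `ok` result above comes from the pipeline `nmcli -f NAME,FILENAME connection show | grep lsr101.90 | grep /etc`. Its filter logic can be exercised without NetworkManager installed by faking the nmcli listing; in the sketch below, the `lsr101.90` line is the real stdout from this run, while the `eth0` line is a hypothetical volatile profile added for contrast:

```shell
# Simulated `nmcli -f NAME,FILENAME connection show` output. Only the
# lsr101.90 line is taken from this log; the eth0 line is invented to show
# what the /etc filter excludes.
nmcli_output() {
    cat <<'EOF'
lsr101.90  /etc/NetworkManager/system-connections/lsr101.90.nmconnection
eth0       /run/NetworkManager/system-connections/eth0.nmconnection
EOF
}
profile="lsr101.90"
# The first grep selects the profile under test; the second keeps only
# keyfiles under /etc, i.e. persistent NetworkManager profiles (profiles
# under /run are runtime-only and would be dropped here).
nmcli_output | grep "$profile" | grep /etc
```

A nonzero exit status from this pipeline (no matching line) is what the registered `nm_profile_exists.rc` captures for the conditional in the following task.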
51385 1727204602.25174: checking for max_fail_percentage
51385 1727204602.25176: done checking for max_fail_percentage
51385 1727204602.25177: checking to see if all hosts have failed and the running result is not ok
51385 1727204602.25178: done checking to see if all hosts have failed
51385 1727204602.25179: getting the remaining hosts for this loop
51385 1727204602.25181: done getting the remaining hosts for this loop
51385 1727204602.25184: getting the next task for host managed-node1
51385 1727204602.25191: done getting next task for host managed-node1
51385 1727204602.25193: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
51385 1727204602.25197: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
51385 1727204602.25202: getting variables
51385 1727204602.25204: in VariableManager get_vars()
51385 1727204602.25245: Calling all_inventory to load vars for managed-node1
51385 1727204602.25248: Calling groups_inventory to load vars for managed-node1
51385 1727204602.25250: Calling all_plugins_inventory to load vars for managed-node1
51385 1727204602.25260: Calling all_plugins_play to load vars for managed-node1
51385 1727204602.25263: Calling groups_plugins_inventory to load vars for managed-node1
51385 1727204602.25275: Calling groups_plugins_play to load vars for managed-node1
51385 1727204602.26857: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
51385 1727204602.27825: done with get_vars()
51385 1727204602.27845: done getting variables
51385 1727204602.27896: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] ***
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35
Tuesday 24 September 2024 15:03:22 -0400 (0:00:00.417) 0:00:20.683 *****
51385 1727204602.27920: entering _queue_task() for managed-node1/set_fact
51385 1727204602.28161: worker is 1 (out of 1 available)
51385 1727204602.28178: exiting _queue_task() for managed-node1/set_fact
51385 1727204602.28189: done queuing things up, now waiting for results queue to drain
51385 1727204602.28190: waiting for pending results...
51385 1727204602.28374: running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
51385 1727204602.28446: in run() - task 0affcd87-79f5-6b1f-5706-00000000083c
51385 1727204602.28457: variable 'ansible_search_path' from source: unknown
51385 1727204602.28463: variable 'ansible_search_path' from source: unknown
51385 1727204602.28492: calling self._execute()
51385 1727204602.28571: variable 'ansible_host' from source: host vars for 'managed-node1'
51385 1727204602.28575: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
51385 1727204602.28584: variable 'omit' from source: magic vars
51385 1727204602.28869: variable 'ansible_distribution_major_version' from source: facts
51385 1727204602.28873: Evaluated conditional (ansible_distribution_major_version != '6'): True
51385 1727204602.28967: variable 'nm_profile_exists' from source: set_fact
51385 1727204602.28983: Evaluated conditional (nm_profile_exists.rc == 0): True
51385 1727204602.28988: variable 'omit' from source: magic vars
51385 1727204602.29019: variable 'omit' from source: magic vars
51385 1727204602.29040: variable 'omit' from source: magic vars
51385 1727204602.29083: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
51385 1727204602.29105: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
51385 1727204602.29123: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
51385 1727204602.29136: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
51385 1727204602.29146: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
51385 1727204602.29172: variable 'inventory_hostname' from source: host vars for 'managed-node1'
51385 1727204602.29177: variable 'ansible_host' from source: host vars for 'managed-node1'
51385 1727204602.29179: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
51385 1727204602.29249: Set connection var ansible_pipelining to False
51385 1727204602.29252: Set connection var ansible_shell_type to sh
51385 1727204602.29262: Set connection var ansible_module_compression to ZIP_DEFLATED
51385 1727204602.29268: Set connection var ansible_timeout to 10
51385 1727204602.29270: Set connection var ansible_connection to ssh
51385 1727204602.29275: Set connection var ansible_shell_executable to /bin/sh
51385 1727204602.29296: variable 'ansible_shell_executable' from source: unknown
51385 1727204602.29301: variable 'ansible_connection' from source: unknown
51385 1727204602.29303: variable 'ansible_module_compression' from source: unknown
51385 1727204602.29305: variable 'ansible_shell_type' from source: unknown
51385 1727204602.29308: variable 'ansible_shell_executable' from source: unknown
51385 1727204602.29310: variable 'ansible_host' from source: host vars for 'managed-node1'
51385 1727204602.29312: variable 'ansible_pipelining' from source: unknown
51385 1727204602.29319: variable 'ansible_timeout' from source: unknown
51385 1727204602.29321: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
51385 1727204602.29419: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
51385 1727204602.29430: variable 'omit' from source: magic vars
51385 1727204602.29435: starting attempt loop
51385 1727204602.29437: running the handler
51385 1727204602.29448: handler run complete
51385 1727204602.29457: attempt loop complete, returning result
51385 1727204602.29462: _execute() done
51385 1727204602.29466: dumping result to json
51385 1727204602.29469: done dumping result, returning
51385 1727204602.29471: done running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0affcd87-79f5-6b1f-5706-00000000083c]
51385 1727204602.29477: sending task result for task 0affcd87-79f5-6b1f-5706-00000000083c
51385 1727204602.29566: done sending task result for task 0affcd87-79f5-6b1f-5706-00000000083c
51385 1727204602.29569: WORKER PROCESS EXITING
ok: [managed-node1] => {
    "ansible_facts": {
        "lsr_net_profile_ansible_managed": true,
        "lsr_net_profile_exists": true,
        "lsr_net_profile_fingerprint": true
    },
    "changed": false
}
51385 1727204602.29620: no more pending results, returning what we have
51385 1727204602.29623: results queue empty
51385 1727204602.29624: checking for any_errors_fatal
51385 1727204602.29636: done checking for any_errors_fatal
51385 1727204602.29637: checking for max_fail_percentage
51385 1727204602.29646: done checking for max_fail_percentage
51385 1727204602.29647: checking to see if all hosts have failed and the running result is not ok
51385 1727204602.29648: done checking to see if all hosts have failed
51385 1727204602.29648: getting the remaining hosts for this loop
51385 1727204602.29650: done getting the remaining hosts for this loop
51385 1727204602.29654: getting the next task for host managed-node1
51385 1727204602.29667: done getting next task for host managed-node1
51385 1727204602.29669: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }}
51385 1727204602.29672: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
51385 1727204602.29677: getting variables
51385 1727204602.29678: in VariableManager get_vars()
51385 1727204602.29717: Calling all_inventory to load vars for managed-node1
51385 1727204602.29719: Calling groups_inventory to load vars for managed-node1
51385 1727204602.29721: Calling all_plugins_inventory to load vars for managed-node1
51385 1727204602.29731: Calling all_plugins_play to load vars for managed-node1
51385 1727204602.29734: Calling groups_plugins_inventory to load vars for managed-node1
51385 1727204602.29736: Calling groups_plugins_play to load vars for managed-node1
51385 1727204602.30568: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
51385 1727204602.31506: done with get_vars()
51385 1727204602.31527: done getting variables
51385 1727204602.31579: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
51385 1727204602.31673: variable 'profile' from source: include params
51385 1727204602.31676: variable 'item' from source: include params
51385 1727204602.31722: variable 'item' from source: include params

TASK [Get the ansible_managed comment in ifcfg-lsr101.90] **********************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49
Tuesday 24 September 2024 15:03:22 -0400 (0:00:00.038) 0:00:20.721 *****
51385 1727204602.31750: entering _queue_task() for managed-node1/command
51385 1727204602.31993: worker is 1 (out of 1 available)
51385 1727204602.32006: exiting _queue_task() for managed-node1/command
51385 1727204602.32019: done queuing things up, now waiting for results queue to drain
51385 1727204602.32021: waiting for pending results...
51385 1727204602.32213: running TaskExecutor() for managed-node1/TASK: Get the ansible_managed comment in ifcfg-lsr101.90
51385 1727204602.32298: in run() - task 0affcd87-79f5-6b1f-5706-00000000083e
51385 1727204602.32307: variable 'ansible_search_path' from source: unknown
51385 1727204602.32311: variable 'ansible_search_path' from source: unknown
51385 1727204602.32341: calling self._execute()
51385 1727204602.32424: variable 'ansible_host' from source: host vars for 'managed-node1'
51385 1727204602.32428: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
51385 1727204602.32438: variable 'omit' from source: magic vars
51385 1727204602.32719: variable 'ansible_distribution_major_version' from source: facts
51385 1727204602.32729: Evaluated conditional (ansible_distribution_major_version != '6'): True
51385 1727204602.32819: variable 'profile_stat' from source: set_fact
51385 1727204602.32830: Evaluated conditional (profile_stat.stat.exists): False
51385 1727204602.32833: when evaluation is False, skipping this task
51385 1727204602.32836: _execute() done
51385 1727204602.32840: dumping result to json
51385 1727204602.32843: done dumping result, returning
51385 1727204602.32848: done running TaskExecutor() for managed-node1/TASK: Get the ansible_managed comment in ifcfg-lsr101.90 [0affcd87-79f5-6b1f-5706-00000000083e]
51385 1727204602.32854: sending task result for task 0affcd87-79f5-6b1f-5706-00000000083e
51385 1727204602.32942: done sending task result for task 0affcd87-79f5-6b1f-5706-00000000083e
51385 1727204602.32945: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "profile_stat.stat.exists",
    "skip_reason": "Conditional result was False"
}
51385 1727204602.33000: no more pending results, returning what we have
51385 1727204602.33004: results queue empty
51385 1727204602.33005: checking for any_errors_fatal
51385 1727204602.33012: done checking for any_errors_fatal
51385 1727204602.33013: checking for max_fail_percentage
51385 1727204602.33014: done checking for max_fail_percentage
51385 1727204602.33015: checking to see if all hosts have failed and the running result is not ok
51385 1727204602.33016: done checking to see if all hosts have failed
51385 1727204602.33016: getting the remaining hosts for this loop
51385 1727204602.33018: done getting the remaining hosts for this loop
51385 1727204602.33021: getting the next task for host managed-node1
51385 1727204602.33029: done getting next task for host managed-node1
51385 1727204602.33031: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }}
51385 1727204602.33035: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
51385 1727204602.33041: getting variables
51385 1727204602.33042: in VariableManager get_vars()
51385 1727204602.33091: Calling all_inventory to load vars for managed-node1
51385 1727204602.33094: Calling groups_inventory to load vars for managed-node1
51385 1727204602.33096: Calling all_plugins_inventory to load vars for managed-node1
51385 1727204602.33107: Calling all_plugins_play to load vars for managed-node1
51385 1727204602.33109: Calling groups_plugins_inventory to load vars for managed-node1
51385 1727204602.33111: Calling groups_plugins_play to load vars for managed-node1
51385 1727204602.34083: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
51385 1727204602.35016: done with get_vars()
51385 1727204602.35034: done getting variables
51385 1727204602.35084: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
51385 1727204602.35172: variable 'profile' from source: include params
51385 1727204602.35176: variable 'item' from source: include params
51385 1727204602.35218: variable 'item' from source: include params

TASK [Verify the ansible_managed comment in ifcfg-lsr101.90] *******************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56
Tuesday 24 September 2024 15:03:22 -0400 (0:00:00.034) 0:00:20.756 *****
51385 1727204602.35244: entering _queue_task() for managed-node1/set_fact
51385 1727204602.35487: worker is 1 (out of 1 available)
51385 1727204602.35502: exiting _queue_task() for managed-node1/set_fact
51385 1727204602.35513: done queuing things up, now waiting for results queue to drain
51385 1727204602.35514: waiting for pending results...
51385 1727204602.35700: running TaskExecutor() for managed-node1/TASK: Verify the ansible_managed comment in ifcfg-lsr101.90
51385 1727204602.35777: in run() - task 0affcd87-79f5-6b1f-5706-00000000083f
51385 1727204602.35787: variable 'ansible_search_path' from source: unknown
51385 1727204602.35791: variable 'ansible_search_path' from source: unknown
51385 1727204602.35820: calling self._execute()
51385 1727204602.35896: variable 'ansible_host' from source: host vars for 'managed-node1'
51385 1727204602.35899: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
51385 1727204602.35908: variable 'omit' from source: magic vars
51385 1727204602.36180: variable 'ansible_distribution_major_version' from source: facts
51385 1727204602.36191: Evaluated conditional (ansible_distribution_major_version != '6'): True
51385 1727204602.36276: variable 'profile_stat' from source: set_fact
51385 1727204602.36286: Evaluated conditional (profile_stat.stat.exists): False
51385 1727204602.36291: when evaluation is False, skipping this task
51385 1727204602.36294: _execute() done
51385 1727204602.36297: dumping result to json
51385 1727204602.36299: done dumping result, returning
51385 1727204602.36303: done running TaskExecutor() for managed-node1/TASK: Verify the ansible_managed comment in ifcfg-lsr101.90 [0affcd87-79f5-6b1f-5706-00000000083f]
51385 1727204602.36310: sending task result for task 0affcd87-79f5-6b1f-5706-00000000083f
51385 1727204602.36398: done sending task result for task 0affcd87-79f5-6b1f-5706-00000000083f
51385 1727204602.36401: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "profile_stat.stat.exists",
    "skip_reason": "Conditional result was False"
}
51385 1727204602.36462: no more pending results, returning what we have
51385 1727204602.36468: results queue empty
51385 1727204602.36469: checking for any_errors_fatal
51385 1727204602.36478: done checking for any_errors_fatal
51385 1727204602.36479: checking for max_fail_percentage
51385 1727204602.36481: done checking for max_fail_percentage
51385 1727204602.36482: checking to see if all hosts have failed and the running result is not ok
51385 1727204602.36482: done checking to see if all hosts have failed
51385 1727204602.36483: getting the remaining hosts for this loop
51385 1727204602.36485: done getting the remaining hosts for this loop
51385 1727204602.36488: getting the next task for host managed-node1
51385 1727204602.36494: done getting next task for host managed-node1
51385 1727204602.36496: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }}
51385 1727204602.36500: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
51385 1727204602.36503: getting variables
51385 1727204602.36505: in VariableManager get_vars()
51385 1727204602.36549: Calling all_inventory to load vars for managed-node1
51385 1727204602.36552: Calling groups_inventory to load vars for managed-node1
51385 1727204602.36554: Calling all_plugins_inventory to load vars for managed-node1
51385 1727204602.36568: Calling all_plugins_play to load vars for managed-node1
51385 1727204602.36571: Calling groups_plugins_inventory to load vars for managed-node1
51385 1727204602.36574: Calling groups_plugins_play to load vars for managed-node1
51385 1727204602.37787: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
51385 1727204602.39692: done with get_vars()
51385 1727204602.39718: done getting variables
51385 1727204602.39793: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
51385 1727204602.39917: variable 'profile' from source: include params
51385 1727204602.39921: variable 'item' from source: include params
51385 1727204602.39989: variable 'item' from source: include params

TASK [Get the fingerprint comment in ifcfg-lsr101.90] **************************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62
Tuesday 24 September 2024 15:03:22 -0400 (0:00:00.047) 0:00:20.804 *****
51385 1727204602.40056: entering _queue_task() for managed-node1/command
51385 1727204602.40303: worker is 1 (out of 1 available)
51385 1727204602.40317: exiting _queue_task() for managed-node1/command
51385 1727204602.40329: done queuing things up, now waiting for results queue to drain
51385 1727204602.40330: waiting for pending results...
51385 1727204602.40516: running TaskExecutor() for managed-node1/TASK: Get the fingerprint comment in ifcfg-lsr101.90
51385 1727204602.40605: in run() - task 0affcd87-79f5-6b1f-5706-000000000840
51385 1727204602.40613: variable 'ansible_search_path' from source: unknown
51385 1727204602.40617: variable 'ansible_search_path' from source: unknown
51385 1727204602.40650: calling self._execute()
51385 1727204602.40726: variable 'ansible_host' from source: host vars for 'managed-node1'
51385 1727204602.40729: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
51385 1727204602.40738: variable 'omit' from source: magic vars
51385 1727204602.41320: variable 'ansible_distribution_major_version' from source: facts
51385 1727204602.41323: Evaluated conditional (ansible_distribution_major_version != '6'): True
51385 1727204602.41325: variable 'profile_stat' from source: set_fact
51385 1727204602.41327: Evaluated conditional (profile_stat.stat.exists): False
51385 1727204602.41329: when evaluation is False, skipping this task
51385 1727204602.41331: _execute() done
51385 1727204602.41332: dumping result to json
51385 1727204602.41334: done dumping result, returning
51385 1727204602.41336: done running TaskExecutor() for managed-node1/TASK: Get the fingerprint comment in ifcfg-lsr101.90 [0affcd87-79f5-6b1f-5706-000000000840]
51385 1727204602.41338: sending task result for task 0affcd87-79f5-6b1f-5706-000000000840
51385 1727204602.41403: done sending task result for task 0affcd87-79f5-6b1f-5706-000000000840
51385 1727204602.41406: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "profile_stat.stat.exists",
    "skip_reason": "Conditional result was False"
}
51385 1727204602.41448: no more pending results, returning what we have
51385 1727204602.41451: results queue empty
51385 1727204602.41452: checking for any_errors_fatal
51385 1727204602.41457: done checking for any_errors_fatal
51385 1727204602.41458: checking for max_fail_percentage
51385 1727204602.41462: done checking for max_fail_percentage
51385 1727204602.41463: checking to see if all hosts have failed and the running result is not ok
51385 1727204602.41467: done checking to see if all hosts have failed
51385 1727204602.41468: getting the remaining hosts for this loop
51385 1727204602.41470: done getting the remaining hosts for this loop
51385 1727204602.41473: getting the next task for host managed-node1
51385 1727204602.41478: done getting next task for host managed-node1
51385 1727204602.41481: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }}
51385 1727204602.41486: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
51385 1727204602.41490: getting variables
51385 1727204602.41491: in VariableManager get_vars()
51385 1727204602.41549: Calling all_inventory to load vars for managed-node1
51385 1727204602.41552: Calling groups_inventory to load vars for managed-node1
51385 1727204602.41555: Calling all_plugins_inventory to load vars for managed-node1
51385 1727204602.41570: Calling all_plugins_play to load vars for managed-node1
51385 1727204602.41573: Calling groups_plugins_inventory to load vars for managed-node1
51385 1727204602.41576: Calling groups_plugins_play to load vars for managed-node1
51385 1727204602.42726: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
51385 1727204602.43672: done with get_vars()
51385 1727204602.43697: done getting variables
51385 1727204602.43746: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
51385 1727204602.43838: variable 'profile' from source: include params
51385 1727204602.43841: variable 'item' from source: include params
51385 1727204602.43887: variable 'item' from source: include params

TASK [Verify the fingerprint comment in ifcfg-lsr101.90] ***********************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69
Tuesday 24 September 2024 15:03:22 -0400 (0:00:00.038) 0:00:20.843 *****
51385 1727204602.43910: entering _queue_task() for managed-node1/set_fact
51385 1727204602.44229: worker is 1 (out of 1 available)
51385 1727204602.44242: exiting _queue_task() for managed-node1/set_fact
51385 1727204602.44254: done queuing things up, now waiting for results queue to drain
51385 1727204602.44256: waiting for pending results...
51385 1727204602.44582: running TaskExecutor() for managed-node1/TASK: Verify the fingerprint comment in ifcfg-lsr101.90
51385 1727204602.44774: in run() - task 0affcd87-79f5-6b1f-5706-000000000841
51385 1727204602.44778: variable 'ansible_search_path' from source: unknown
51385 1727204602.44781: variable 'ansible_search_path' from source: unknown
51385 1727204602.44783: calling self._execute()
51385 1727204602.44882: variable 'ansible_host' from source: host vars for 'managed-node1'
51385 1727204602.44886: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
51385 1727204602.44889: variable 'omit' from source: magic vars
51385 1727204602.45207: variable 'ansible_distribution_major_version' from source: facts
51385 1727204602.45211: Evaluated conditional (ansible_distribution_major_version != '6'): True
51385 1727204602.45328: variable 'profile_stat' from source: set_fact
51385 1727204602.45339: Evaluated conditional (profile_stat.stat.exists): False
51385 1727204602.45342: when evaluation is False, skipping this task
51385 1727204602.45345: _execute() done
51385 1727204602.45347: dumping result to json
51385 1727204602.45358: done dumping result, returning
51385 1727204602.45365: done running TaskExecutor() for managed-node1/TASK: Verify the fingerprint comment in ifcfg-lsr101.90 [0affcd87-79f5-6b1f-5706-000000000841]
51385 1727204602.45371: sending task result for task 0affcd87-79f5-6b1f-5706-000000000841
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "profile_stat.stat.exists",
    "skip_reason": "Conditional result was False"
}
51385 1727204602.45510: no more pending results, returning what we have
51385 1727204602.45515: results queue empty
51385 1727204602.45516: checking for any_errors_fatal
51385 1727204602.45525: done checking for any_errors_fatal
51385 1727204602.45526: checking for max_fail_percentage
51385 1727204602.45528: done checking for max_fail_percentage
51385 1727204602.45530: checking to see if all hosts have failed and the running result is not ok
51385 1727204602.45531: done checking to see if all hosts have failed
51385 1727204602.45532: getting the remaining hosts for this loop
51385 1727204602.45533: done getting the remaining hosts for this loop
51385 1727204602.45538: getting the next task for host managed-node1
51385 1727204602.45547: done getting next task for host managed-node1
51385 1727204602.45549: ^ task is: TASK: Assert that the profile is present - '{{ profile }}'
51385 1727204602.45554: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
51385 1727204602.45558: getting variables
51385 1727204602.45560: in VariableManager get_vars()
51385 1727204602.45606: Calling all_inventory to load vars for managed-node1
51385 1727204602.45609: Calling groups_inventory to load vars for managed-node1
51385 1727204602.45612: Calling all_plugins_inventory to load vars for managed-node1
51385 1727204602.45618: done sending task result for task 0affcd87-79f5-6b1f-5706-000000000841
51385 1727204602.45632: Calling all_plugins_play to load vars for managed-node1
51385 1727204602.45635: Calling groups_plugins_inventory to load vars for managed-node1
51385 1727204602.45641: WORKER PROCESS EXITING
51385 1727204602.45646: Calling groups_plugins_play to load vars for managed-node1
51385 1727204602.47536: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
51385 1727204602.49201: done with get_vars()
51385 1727204602.49228: done getting variables
51385 1727204602.49291: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
51385 1727204602.49413: variable 'profile' from source: include params
51385 1727204602.49417: variable 'item' from source: include params
51385 1727204602.49476: variable 'item' from source: include params

TASK [Assert that the profile is present - 'lsr101.90'] ************************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5
Tuesday 24 September 2024 15:03:22 -0400 (0:00:00.055) 0:00:20.899 *****
51385 1727204602.49507: entering _queue_task() for managed-node1/assert
51385 1727204602.49833: worker is 1 (out of 1 available)
51385 1727204602.49848: exiting _queue_task() for managed-node1/assert
51385 1727204602.49860: done queuing things up, now waiting for results queue to drain
51385 1727204602.49862: waiting for pending results...
51385 1727204602.50151: running TaskExecutor() for managed-node1/TASK: Assert that the profile is present - 'lsr101.90'
51385 1727204602.50249: in run() - task 0affcd87-79f5-6b1f-5706-0000000006c0
51385 1727204602.50258: variable 'ansible_search_path' from source: unknown
51385 1727204602.50267: variable 'ansible_search_path' from source: unknown
51385 1727204602.50302: calling self._execute()
51385 1727204602.50402: variable 'ansible_host' from source: host vars for 'managed-node1'
51385 1727204602.50407: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
51385 1727204602.50421: variable 'omit' from source: magic vars
51385 1727204602.50782: variable 'ansible_distribution_major_version' from source: facts
51385 1727204602.50794: Evaluated conditional (ansible_distribution_major_version != '6'): True
51385 1727204602.50801: variable 'omit' from source: magic vars
51385 1727204602.50841: variable 'omit' from source: magic vars
51385 1727204602.50941: variable 'profile' from source: include params
51385 1727204602.50945: variable 'item' from source: include params
51385 1727204602.51012: variable 'item' from source: include params
51385 1727204602.51030: variable 'omit' from source: magic vars
51385 1727204602.51074: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
51385 1727204602.51113: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
51385 1727204602.51132: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
51385 1727204602.51148: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
51385 1727204602.51163: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
51385 1727204602.51195: variable 'inventory_hostname' from source: host vars for 'managed-node1'
51385 1727204602.51198: variable 'ansible_host' from source: host vars for 'managed-node1'
51385 1727204602.51201: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
51385 1727204602.51303: Set connection var ansible_pipelining to False
51385 1727204602.51307: Set connection var ansible_shell_type to sh
51385 1727204602.51317: Set connection var ansible_module_compression to ZIP_DEFLATED
51385 1727204602.51324: Set connection var ansible_timeout to 10
51385 1727204602.51327: Set connection var ansible_connection to ssh
51385 1727204602.51333: Set connection var ansible_shell_executable to /bin/sh
51385 1727204602.51354: variable 'ansible_shell_executable' from source: unknown
51385 1727204602.51357: variable 'ansible_connection' from source: unknown
51385 1727204602.51362: variable 'ansible_module_compression' from source: unknown
51385 1727204602.51367: variable 'ansible_shell_type' from source: unknown
51385 1727204602.51369: variable 'ansible_shell_executable' from source: unknown
51385 1727204602.51371: variable 'ansible_host' from source: host vars for 'managed-node1'
51385 1727204602.51374: variable 'ansible_pipelining' from source: unknown
51385 1727204602.51376: variable 'ansible_timeout' from source: unknown
51385 1727204602.51378: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
51385 1727204602.51516: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
51385 1727204602.51527: variable 'omit' from source: magic vars
51385 1727204602.51532: starting attempt loop
51385 1727204602.51535: running the handler
51385 1727204602.51649: variable 'lsr_net_profile_exists' from source: set_fact
51385 1727204602.51653: Evaluated conditional (lsr_net_profile_exists): True
51385 1727204602.51664: handler run complete
51385 1727204602.51677: attempt loop complete, returning result
51385 1727204602.51680: _execute() done
51385 1727204602.51682: dumping result to json
51385 1727204602.51685: done dumping result, returning
51385 1727204602.51692: done running TaskExecutor() for managed-node1/TASK: Assert that the profile is present - 'lsr101.90' [0affcd87-79f5-6b1f-5706-0000000006c0]
51385 1727204602.51698: sending task result for task 0affcd87-79f5-6b1f-5706-0000000006c0
51385 1727204602.51791: done sending task result for task 0affcd87-79f5-6b1f-5706-0000000006c0
51385 1727204602.51794: WORKER PROCESS EXITING
ok: [managed-node1] => {
    "changed": false
}

MSG:

All assertions passed
51385 1727204602.51875: no more pending results, returning what we have
51385 1727204602.51879: results queue empty
51385 1727204602.51880: checking for any_errors_fatal
51385 1727204602.51886: done checking for any_errors_fatal
51385 1727204602.51887: checking for max_fail_percentage
51385 1727204602.51889: done checking for max_fail_percentage
51385 1727204602.51890: checking to see if all hosts have failed and the running result is not ok
51385 1727204602.51891: done checking to see if all hosts have failed
51385 1727204602.51892: getting the remaining hosts for this loop
51385 1727204602.51893: done getting the remaining hosts for this loop
51385 1727204602.51897: getting the next task for host managed-node1
51385 1727204602.51904: done getting next task for host managed-node1
51385 1727204602.51907: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}'
51385 1727204602.51911: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1,
update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 51385 1727204602.51915: getting variables 51385 1727204602.51917: in VariableManager get_vars() 51385 1727204602.51960: Calling all_inventory to load vars for managed-node1 51385 1727204602.51965: Calling groups_inventory to load vars for managed-node1 51385 1727204602.51968: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204602.51979: Calling all_plugins_play to load vars for managed-node1 51385 1727204602.51982: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204602.51985: Calling groups_plugins_play to load vars for managed-node1 51385 1727204602.53594: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204602.55245: done with get_vars() 51385 1727204602.55279: done getting variables 51385 1727204602.55345: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 51385 1727204602.55461: variable 'profile' from source: include params 51385 1727204602.55467: variable 'item' from source: include params 51385 1727204602.55518: variable 'item' from source: include params TASK [Assert that the ansible managed comment is present in 'lsr101.90'] ******* task path: 
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Tuesday 24 September 2024 15:03:22 -0400 (0:00:00.060) 0:00:20.959 ***** 51385 1727204602.55559: entering _queue_task() for managed-node1/assert 51385 1727204602.55885: worker is 1 (out of 1 available) 51385 1727204602.55898: exiting _queue_task() for managed-node1/assert 51385 1727204602.55910: done queuing things up, now waiting for results queue to drain 51385 1727204602.55911: waiting for pending results... 51385 1727204602.56193: running TaskExecutor() for managed-node1/TASK: Assert that the ansible managed comment is present in 'lsr101.90' 51385 1727204602.56287: in run() - task 0affcd87-79f5-6b1f-5706-0000000006c1 51385 1727204602.56299: variable 'ansible_search_path' from source: unknown 51385 1727204602.56303: variable 'ansible_search_path' from source: unknown 51385 1727204602.56340: calling self._execute() 51385 1727204602.56433: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204602.56438: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204602.56449: variable 'omit' from source: magic vars 51385 1727204602.56819: variable 'ansible_distribution_major_version' from source: facts 51385 1727204602.56830: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204602.56837: variable 'omit' from source: magic vars 51385 1727204602.56879: variable 'omit' from source: magic vars 51385 1727204602.56982: variable 'profile' from source: include params 51385 1727204602.56985: variable 'item' from source: include params 51385 1727204602.57051: variable 'item' from source: include params 51385 1727204602.57070: variable 'omit' from source: magic vars 51385 1727204602.57115: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 51385 1727204602.57152: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 51385 1727204602.57173: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 51385 1727204602.57190: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 51385 1727204602.57201: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 51385 1727204602.57235: variable 'inventory_hostname' from source: host vars for 'managed-node1' 51385 1727204602.57239: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204602.57242: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204602.57346: Set connection var ansible_pipelining to False 51385 1727204602.57349: Set connection var ansible_shell_type to sh 51385 1727204602.57361: Set connection var ansible_module_compression to ZIP_DEFLATED 51385 1727204602.57367: Set connection var ansible_timeout to 10 51385 1727204602.57370: Set connection var ansible_connection to ssh 51385 1727204602.57376: Set connection var ansible_shell_executable to /bin/sh 51385 1727204602.57398: variable 'ansible_shell_executable' from source: unknown 51385 1727204602.57401: variable 'ansible_connection' from source: unknown 51385 1727204602.57404: variable 'ansible_module_compression' from source: unknown 51385 1727204602.57407: variable 'ansible_shell_type' from source: unknown 51385 1727204602.57409: variable 'ansible_shell_executable' from source: unknown 51385 1727204602.57411: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204602.57413: variable 'ansible_pipelining' from source: unknown 51385 1727204602.57417: variable 'ansible_timeout' from source: unknown 51385 1727204602.57422: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 
1727204602.57561: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 51385 1727204602.57571: variable 'omit' from source: magic vars 51385 1727204602.57576: starting attempt loop 51385 1727204602.57579: running the handler 51385 1727204602.57692: variable 'lsr_net_profile_ansible_managed' from source: set_fact 51385 1727204602.57696: Evaluated conditional (lsr_net_profile_ansible_managed): True 51385 1727204602.57704: handler run complete 51385 1727204602.57718: attempt loop complete, returning result 51385 1727204602.57721: _execute() done 51385 1727204602.57723: dumping result to json 51385 1727204602.57726: done dumping result, returning 51385 1727204602.57732: done running TaskExecutor() for managed-node1/TASK: Assert that the ansible managed comment is present in 'lsr101.90' [0affcd87-79f5-6b1f-5706-0000000006c1] 51385 1727204602.57738: sending task result for task 0affcd87-79f5-6b1f-5706-0000000006c1 51385 1727204602.57826: done sending task result for task 0affcd87-79f5-6b1f-5706-0000000006c1 51385 1727204602.57830: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false } MSG: All assertions passed 51385 1727204602.57914: no more pending results, returning what we have 51385 1727204602.57918: results queue empty 51385 1727204602.57919: checking for any_errors_fatal 51385 1727204602.57927: done checking for any_errors_fatal 51385 1727204602.57928: checking for max_fail_percentage 51385 1727204602.57930: done checking for max_fail_percentage 51385 1727204602.57931: checking to see if all hosts have failed and the running result is not ok 51385 1727204602.57932: done checking to see if all hosts have failed 51385 1727204602.57933: getting the remaining hosts for this loop 51385 1727204602.57935: done 
getting the remaining hosts for this loop 51385 1727204602.57939: getting the next task for host managed-node1 51385 1727204602.57946: done getting next task for host managed-node1 51385 1727204602.57948: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 51385 1727204602.57952: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 51385 1727204602.57957: getting variables 51385 1727204602.57958: in VariableManager get_vars() 51385 1727204602.58004: Calling all_inventory to load vars for managed-node1 51385 1727204602.58007: Calling groups_inventory to load vars for managed-node1 51385 1727204602.58009: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204602.58021: Calling all_plugins_play to load vars for managed-node1 51385 1727204602.58024: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204602.58026: Calling groups_plugins_play to load vars for managed-node1 51385 1727204602.59809: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204602.61463: done with get_vars() 51385 1727204602.61493: done getting variables 51385 1727204602.61559: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=True) 51385 1727204602.61677: variable 'profile' from source: include params 51385 1727204602.61681: variable 'item' from source: include params 51385 1727204602.61740: variable 'item' from source: include params TASK [Assert that the fingerprint comment is present in lsr101.90] ************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Tuesday 24 September 2024 15:03:22 -0400 (0:00:00.062) 0:00:21.021 ***** 51385 1727204602.61779: entering _queue_task() for managed-node1/assert 51385 1727204602.62097: worker is 1 (out of 1 available) 51385 1727204602.62110: exiting _queue_task() for managed-node1/assert 51385 1727204602.62122: done queuing things up, now waiting for results queue to drain 51385 1727204602.62123: waiting for pending results... 51385 1727204602.62405: running TaskExecutor() for managed-node1/TASK: Assert that the fingerprint comment is present in lsr101.90 51385 1727204602.62497: in run() - task 0affcd87-79f5-6b1f-5706-0000000006c2 51385 1727204602.62509: variable 'ansible_search_path' from source: unknown 51385 1727204602.62512: variable 'ansible_search_path' from source: unknown 51385 1727204602.62549: calling self._execute() 51385 1727204602.62643: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204602.62649: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204602.62661: variable 'omit' from source: magic vars 51385 1727204602.63025: variable 'ansible_distribution_major_version' from source: facts 51385 1727204602.63037: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204602.63044: variable 'omit' from source: magic vars 51385 1727204602.63086: variable 'omit' from source: magic vars 51385 1727204602.63186: variable 'profile' from source: include params 51385 1727204602.63190: variable 'item' from source: include params 
51385 1727204602.63257: variable 'item' from source: include params 51385 1727204602.63276: variable 'omit' from source: magic vars 51385 1727204602.63318: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 51385 1727204602.63361: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 51385 1727204602.63381: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 51385 1727204602.63399: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 51385 1727204602.63410: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 51385 1727204602.63445: variable 'inventory_hostname' from source: host vars for 'managed-node1' 51385 1727204602.63448: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204602.63451: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204602.63550: Set connection var ansible_pipelining to False 51385 1727204602.63553: Set connection var ansible_shell_type to sh 51385 1727204602.63566: Set connection var ansible_module_compression to ZIP_DEFLATED 51385 1727204602.63574: Set connection var ansible_timeout to 10 51385 1727204602.63577: Set connection var ansible_connection to ssh 51385 1727204602.63582: Set connection var ansible_shell_executable to /bin/sh 51385 1727204602.63605: variable 'ansible_shell_executable' from source: unknown 51385 1727204602.63610: variable 'ansible_connection' from source: unknown 51385 1727204602.63613: variable 'ansible_module_compression' from source: unknown 51385 1727204602.63615: variable 'ansible_shell_type' from source: unknown 51385 1727204602.63617: variable 'ansible_shell_executable' from source: unknown 51385 1727204602.63619: variable 'ansible_host' from source: 
host vars for 'managed-node1' 51385 1727204602.63621: variable 'ansible_pipelining' from source: unknown 51385 1727204602.63624: variable 'ansible_timeout' from source: unknown 51385 1727204602.63629: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204602.63767: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 51385 1727204602.63778: variable 'omit' from source: magic vars 51385 1727204602.63785: starting attempt loop 51385 1727204602.63788: running the handler 51385 1727204602.63901: variable 'lsr_net_profile_fingerprint' from source: set_fact 51385 1727204602.63905: Evaluated conditional (lsr_net_profile_fingerprint): True 51385 1727204602.63912: handler run complete 51385 1727204602.63926: attempt loop complete, returning result 51385 1727204602.63929: _execute() done 51385 1727204602.63932: dumping result to json 51385 1727204602.63934: done dumping result, returning 51385 1727204602.63940: done running TaskExecutor() for managed-node1/TASK: Assert that the fingerprint comment is present in lsr101.90 [0affcd87-79f5-6b1f-5706-0000000006c2] 51385 1727204602.63947: sending task result for task 0affcd87-79f5-6b1f-5706-0000000006c2 51385 1727204602.64033: done sending task result for task 0affcd87-79f5-6b1f-5706-0000000006c2 51385 1727204602.64038: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false } MSG: All assertions passed 51385 1727204602.64092: no more pending results, returning what we have 51385 1727204602.64096: results queue empty 51385 1727204602.64097: checking for any_errors_fatal 51385 1727204602.64105: done checking for any_errors_fatal 51385 1727204602.64106: checking for max_fail_percentage 51385 1727204602.64108: done checking for 
max_fail_percentage 51385 1727204602.64109: checking to see if all hosts have failed and the running result is not ok 51385 1727204602.64110: done checking to see if all hosts have failed 51385 1727204602.64111: getting the remaining hosts for this loop 51385 1727204602.64113: done getting the remaining hosts for this loop 51385 1727204602.64117: getting the next task for host managed-node1 51385 1727204602.64126: done getting next task for host managed-node1 51385 1727204602.64129: ^ task is: TASK: TEARDOWN: remove profiles. 51385 1727204602.64131: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 51385 1727204602.64135: getting variables 51385 1727204602.64137: in VariableManager get_vars() 51385 1727204602.64183: Calling all_inventory to load vars for managed-node1 51385 1727204602.64186: Calling groups_inventory to load vars for managed-node1 51385 1727204602.64189: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204602.64200: Calling all_plugins_play to load vars for managed-node1 51385 1727204602.64203: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204602.64206: Calling groups_plugins_play to load vars for managed-node1 51385 1727204602.65885: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204602.67519: done with get_vars() 51385 1727204602.67550: done getting variables 51385 1727204602.67619: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK 
[TEARDOWN: remove profiles.] ********************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_vlan_mtu.yml:58 Tuesday 24 September 2024 15:03:22 -0400 (0:00:00.058) 0:00:21.080 ***** 51385 1727204602.67652: entering _queue_task() for managed-node1/debug 51385 1727204602.67978: worker is 1 (out of 1 available) 51385 1727204602.67992: exiting _queue_task() for managed-node1/debug 51385 1727204602.68004: done queuing things up, now waiting for results queue to drain 51385 1727204602.68005: waiting for pending results... 51385 1727204602.68285: running TaskExecutor() for managed-node1/TASK: TEARDOWN: remove profiles. 51385 1727204602.68369: in run() - task 0affcd87-79f5-6b1f-5706-00000000005d 51385 1727204602.68378: variable 'ansible_search_path' from source: unknown 51385 1727204602.68413: calling self._execute() 51385 1727204602.68508: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204602.68511: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204602.68521: variable 'omit' from source: magic vars 51385 1727204602.68896: variable 'ansible_distribution_major_version' from source: facts 51385 1727204602.68908: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204602.68915: variable 'omit' from source: magic vars 51385 1727204602.68934: variable 'omit' from source: magic vars 51385 1727204602.68970: variable 'omit' from source: magic vars 51385 1727204602.69016: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 51385 1727204602.69051: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 51385 1727204602.69076: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 51385 1727204602.69092: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 51385 1727204602.69110: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 51385 1727204602.69138: variable 'inventory_hostname' from source: host vars for 'managed-node1' 51385 1727204602.69141: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204602.69144: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204602.69247: Set connection var ansible_pipelining to False 51385 1727204602.69251: Set connection var ansible_shell_type to sh 51385 1727204602.69262: Set connection var ansible_module_compression to ZIP_DEFLATED 51385 1727204602.69270: Set connection var ansible_timeout to 10 51385 1727204602.69272: Set connection var ansible_connection to ssh 51385 1727204602.69278: Set connection var ansible_shell_executable to /bin/sh 51385 1727204602.69302: variable 'ansible_shell_executable' from source: unknown 51385 1727204602.69305: variable 'ansible_connection' from source: unknown 51385 1727204602.69308: variable 'ansible_module_compression' from source: unknown 51385 1727204602.69310: variable 'ansible_shell_type' from source: unknown 51385 1727204602.69317: variable 'ansible_shell_executable' from source: unknown 51385 1727204602.69320: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204602.69325: variable 'ansible_pipelining' from source: unknown 51385 1727204602.69327: variable 'ansible_timeout' from source: unknown 51385 1727204602.69331: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204602.69468: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 51385 1727204602.69477: variable 'omit' from source: magic vars 51385 1727204602.69483: starting attempt loop 51385 1727204602.69486: running the handler 51385 1727204602.69535: handler run complete 51385 1727204602.69552: attempt loop complete, returning result 51385 1727204602.69555: _execute() done 51385 1727204602.69558: dumping result to json 51385 1727204602.69563: done dumping result, returning 51385 1727204602.69567: done running TaskExecutor() for managed-node1/TASK: TEARDOWN: remove profiles. [0affcd87-79f5-6b1f-5706-00000000005d] 51385 1727204602.69574: sending task result for task 0affcd87-79f5-6b1f-5706-00000000005d 51385 1727204602.69666: done sending task result for task 0affcd87-79f5-6b1f-5706-00000000005d 51385 1727204602.69669: WORKER PROCESS EXITING ok: [managed-node1] => {} MSG: ################################################## 51385 1727204602.69719: no more pending results, returning what we have 51385 1727204602.69723: results queue empty 51385 1727204602.69724: checking for any_errors_fatal 51385 1727204602.69732: done checking for any_errors_fatal 51385 1727204602.69733: checking for max_fail_percentage 51385 1727204602.69735: done checking for max_fail_percentage 51385 1727204602.69736: checking to see if all hosts have failed and the running result is not ok 51385 1727204602.69737: done checking to see if all hosts have failed 51385 1727204602.69738: getting the remaining hosts for this loop 51385 1727204602.69740: done getting the remaining hosts for this loop 51385 1727204602.69743: getting the next task for host managed-node1 51385 1727204602.69750: done getting next task for host managed-node1 51385 1727204602.69758: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 51385 1727204602.69761: ^ state is: HOST STATE: block=2, 
task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 51385 1727204602.69785: getting variables 51385 1727204602.69787: in VariableManager get_vars() 51385 1727204602.69829: Calling all_inventory to load vars for managed-node1 51385 1727204602.69831: Calling groups_inventory to load vars for managed-node1 51385 1727204602.69834: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204602.69844: Calling all_plugins_play to load vars for managed-node1 51385 1727204602.69847: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204602.69850: Calling groups_plugins_play to load vars for managed-node1 51385 1727204602.71607: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204602.73267: done with get_vars() 51385 1727204602.73294: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 15:03:22 -0400 (0:00:00.057) 0:00:21.137 ***** 51385 1727204602.73395: entering _queue_task() for managed-node1/include_tasks 51385 1727204602.73724: worker is 1 (out of 1 available) 51385 1727204602.73738: exiting _queue_task() for managed-node1/include_tasks 51385 1727204602.73751: done queuing things up, now waiting for results queue to drain 51385 1727204602.73752: waiting for pending results... 
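The three assert tasks traced above run from task paths `assert_profile_present.yml:5`, `:10`, and `:15`, and the log shows each one evaluating a single conditional (`lsr_net_profile_exists`, `lsr_net_profile_ansible_managed`, `lsr_net_profile_fingerprint`) to `True`. A plausible reconstruction of that tasks file, inferred only from the task names and conditionals in this log (the actual file may differ):

```yaml
# Hypothetical sketch of assert_profile_present.yml, inferred from the log.
- name: "Assert that the profile is present - '{{ profile }}'"
  assert:
    that:
      - lsr_net_profile_exists

- name: "Assert that the ansible managed comment is present in '{{ profile }}'"
  assert:
    that:
      - lsr_net_profile_ansible_managed

- name: "Assert that the fingerprint comment is present in {{ profile }}"
  assert:
    that:
      - lsr_net_profile_fingerprint
```

Each `lsr_net_profile_*` variable comes `from source: set_fact` per the log, i.e. it was set by an earlier task before these assertions run; when a conditional holds, the `assert` action returns `changed: false` with "All assertions passed", exactly as the `ok: [managed-node1]` results above show.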
51385 1727204602.74046: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role
51385 1727204602.74164: in run() - task 0affcd87-79f5-6b1f-5706-000000000065
51385 1727204602.74177: variable 'ansible_search_path' from source: unknown
51385 1727204602.74180: variable 'ansible_search_path' from source: unknown
51385 1727204602.74220: calling self._execute()
51385 1727204602.74311: variable 'ansible_host' from source: host vars for 'managed-node1'
51385 1727204602.74316: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
51385 1727204602.74327: variable 'omit' from source: magic vars
51385 1727204602.74689: variable 'ansible_distribution_major_version' from source: facts
51385 1727204602.74700: Evaluated conditional (ansible_distribution_major_version != '6'): True
51385 1727204602.74706: _execute() done
51385 1727204602.74711: dumping result to json
51385 1727204602.74714: done dumping result, returning
51385 1727204602.74720: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcd87-79f5-6b1f-5706-000000000065]
51385 1727204602.74726: sending task result for task 0affcd87-79f5-6b1f-5706-000000000065
51385 1727204602.74818: done sending task result for task 0affcd87-79f5-6b1f-5706-000000000065
51385 1727204602.74821: WORKER PROCESS EXITING
51385 1727204602.74887: no more pending results, returning what we have
51385 1727204602.74892: in VariableManager get_vars()
51385 1727204602.74939: Calling all_inventory to load vars for managed-node1
51385 1727204602.74942: Calling groups_inventory to load vars for managed-node1
51385 1727204602.74945: Calling all_plugins_inventory to load vars for managed-node1
51385 1727204602.74958: Calling all_plugins_play to load vars for managed-node1
51385 1727204602.74962: Calling groups_plugins_inventory to load vars for managed-node1
51385 1727204602.74966: Calling groups_plugins_play to load vars for managed-node1
51385 1727204602.76565: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
51385 1727204602.78317: done with get_vars()
51385 1727204602.78339: variable 'ansible_search_path' from source: unknown
51385 1727204602.78341: variable 'ansible_search_path' from source: unknown
51385 1727204602.78385: we have included files to process
51385 1727204602.78387: generating all_blocks data
51385 1727204602.78390: done generating all_blocks data
51385 1727204602.78395: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml
51385 1727204602.78397: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml
51385 1727204602.78400: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml
51385 1727204602.78998: done processing included file
51385 1727204602.79000: iterating over new_blocks loaded from include file
51385 1727204602.79002: in VariableManager get_vars()
51385 1727204602.79029: done with get_vars()
51385 1727204602.79031: filtering new block on tags
51385 1727204602.79051: done filtering new block on tags
51385 1727204602.79054: in VariableManager get_vars()
51385 1727204602.79081: done with get_vars()
51385 1727204602.79083: filtering new block on tags
51385 1727204602.79104: done filtering new block on tags
51385 1727204602.79107: in VariableManager get_vars()
51385 1727204602.79132: done with get_vars()
51385 1727204602.79134: filtering new block on tags
51385 1727204602.79153: done filtering new block on tags
51385 1727204602.79156: done iterating over new_blocks loaded from include file
included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node1
51385 1727204602.79162: extending task lists for all hosts with included blocks
51385 1727204602.79978: done extending task lists
51385 1727204602.79980: done processing included files
51385 1727204602.79981: results queue empty
51385 1727204602.79981: checking for any_errors_fatal
51385 1727204602.79984: done checking for any_errors_fatal
51385 1727204602.79985: checking for max_fail_percentage
51385 1727204602.79986: done checking for max_fail_percentage
51385 1727204602.79986: checking to see if all hosts have failed and the running result is not ok
51385 1727204602.79987: done checking to see if all hosts have failed
51385 1727204602.79988: getting the remaining hosts for this loop
51385 1727204602.79989: done getting the remaining hosts for this loop
51385 1727204602.79991: getting the next task for host managed-node1
51385 1727204602.79995: done getting next task for host managed-node1
51385 1727204602.79997: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present
51385 1727204602.80000: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
51385 1727204602.80010: getting variables
51385 1727204602.80011: in VariableManager get_vars()
51385 1727204602.80027: Calling all_inventory to load vars for managed-node1
51385 1727204602.80029: Calling groups_inventory to load vars for managed-node1
51385 1727204602.80031: Calling all_plugins_inventory to load vars for managed-node1
51385 1727204602.80036: Calling all_plugins_play to load vars for managed-node1
51385 1727204602.80040: Calling groups_plugins_inventory to load vars for managed-node1
51385 1727204602.80043: Calling groups_plugins_play to load vars for managed-node1
51385 1727204602.81282: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
51385 1727204602.82969: done with get_vars()
51385 1727204602.82998: done getting variables

TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] ***
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3
Tuesday 24 September 2024 15:03:22 -0400 (0:00:00.096) 0:00:21.234 *****
51385 1727204602.83084: entering _queue_task() for managed-node1/setup
51385 1727204602.83427: worker is 1 (out of 1 available)
51385 1727204602.83440: exiting _queue_task() for managed-node1/setup
51385 1727204602.83453: done queuing things up, now waiting for results queue to drain
51385 1727204602.83454: waiting for pending results...
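Every task in this role logs the same gate, `Evaluated conditional (ansible_distribution_major_version != '6')`, before running. A minimal Python sketch of how that check resolves (illustrative only, not Ansible's actual Jinja2 evaluator; fact values are strings, hence the quoted '6'):

```python
# Sketch of the per-task distribution gate seen throughout this log.
# Ansible stores ansible_distribution_major_version as a string, so the
# role's `when:` clause compares against the string '6' to skip EL6 hosts.
def distribution_gate(facts: dict) -> bool:
    """Return True when the task should run (host is not EL6)."""
    return facts.get("ansible_distribution_major_version") != "6"

print(distribution_gate({"ansible_distribution_major_version": "9"}))  # True -> task runs
print(distribution_gate({"ansible_distribution_major_version": "6"}))  # False -> task skipped
```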
51385 1727204602.83743: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present
51385 1727204602.83899: in run() - task 0affcd87-79f5-6b1f-5706-000000000883
51385 1727204602.83917: variable 'ansible_search_path' from source: unknown
51385 1727204602.83921: variable 'ansible_search_path' from source: unknown
51385 1727204602.83956: calling self._execute()
51385 1727204602.84045: variable 'ansible_host' from source: host vars for 'managed-node1'
51385 1727204602.84051: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
51385 1727204602.84063: variable 'omit' from source: magic vars
51385 1727204602.84425: variable 'ansible_distribution_major_version' from source: facts
51385 1727204602.84438: Evaluated conditional (ansible_distribution_major_version != '6'): True
51385 1727204602.84656: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
51385 1727204602.87183: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
51385 1727204602.87255: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
51385 1727204602.87298: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
51385 1727204602.87330: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
51385 1727204602.87356: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
51385 1727204602.87436: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
51385 1727204602.87467: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
51385 1727204602.87497: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
51385 1727204602.87535: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
51385 1727204602.87548: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
51385 1727204602.87605: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
51385 1727204602.87626: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
51385 1727204602.87651: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
51385 1727204602.87692: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
51385 1727204602.87711: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
51385 1727204602.87871: variable '__network_required_facts' from source: role '' defaults
51385 1727204602.87880: variable 'ansible_facts' from source: unknown
51385 1727204602.88634: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False
51385 1727204602.88638: when evaluation is False, skipping this task
51385 1727204602.88641: _execute() done
51385 1727204602.88644: dumping result to json
51385 1727204602.88646: done dumping result, returning
51385 1727204602.88654: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcd87-79f5-6b1f-5706-000000000883]
51385 1727204602.88663: sending task result for task 0affcd87-79f5-6b1f-5706-000000000883
51385 1727204602.88752: done sending task result for task 0affcd87-79f5-6b1f-5706-000000000883
51385 1727204602.88754: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
51385 1727204602.88819: no more pending results, returning what we have
51385 1727204602.88824: results queue empty
51385 1727204602.88825: checking for any_errors_fatal
51385 1727204602.88827: done checking for any_errors_fatal
51385 1727204602.88827: checking for max_fail_percentage
51385 1727204602.88829: done checking for max_fail_percentage
51385 1727204602.88830: checking to see if all hosts have failed and the running result is not ok
51385 1727204602.88831: done checking to see if all hosts have failed
51385 1727204602.88832: getting the remaining hosts for this loop
51385 1727204602.88834: done getting the remaining hosts for this loop
51385 1727204602.88839: getting the next task for host managed-node1
51385 1727204602.88849: done getting next task for host managed-node1
51385 1727204602.88853: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree
51385 1727204602.88858: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
51385 1727204602.88879: getting variables
51385 1727204602.88881: in VariableManager get_vars()
51385 1727204602.88925: Calling all_inventory to load vars for managed-node1
51385 1727204602.88928: Calling groups_inventory to load vars for managed-node1
51385 1727204602.88930: Calling all_plugins_inventory to load vars for managed-node1
51385 1727204602.88941: Calling all_plugins_play to load vars for managed-node1
51385 1727204602.88944: Calling groups_plugins_inventory to load vars for managed-node1
51385 1727204602.88947: Calling groups_plugins_play to load vars for managed-node1
51385 1727204602.90635: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
51385 1727204602.92283: done with get_vars()
51385 1727204602.92308: done getting variables

TASK [fedora.linux_system_roles.network : Check if system is ostree] ***********
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12
Tuesday 24 September 2024 15:03:22 -0400 (0:00:00.093) 0:00:21.328 *****
51385 1727204602.92421: entering _queue_task() for managed-node1/stat
51385 1727204602.92743: worker is 1 (out of 1 available)
51385 1727204602.92755: exiting _queue_task() for managed-node1/stat
51385 1727204602.92769: done queuing things up, now waiting for results queue to drain
51385 1727204602.92770: waiting for pending results...
51385 1727204602.93045: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if system is ostree
51385 1727204602.93190: in run() - task 0affcd87-79f5-6b1f-5706-000000000885
51385 1727204602.93202: variable 'ansible_search_path' from source: unknown
51385 1727204602.93205: variable 'ansible_search_path' from source: unknown
51385 1727204602.93245: calling self._execute()
51385 1727204602.93328: variable 'ansible_host' from source: host vars for 'managed-node1'
51385 1727204602.93334: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
51385 1727204602.93344: variable 'omit' from source: magic vars
51385 1727204602.93694: variable 'ansible_distribution_major_version' from source: facts
51385 1727204602.93707: Evaluated conditional (ansible_distribution_major_version != '6'): True
51385 1727204602.93869: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
51385 1727204602.94138: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
51385 1727204602.94185: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
51385 1727204602.94220: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
51385 1727204602.94252: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
51385 1727204602.94337: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
51385 1727204602.94366: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
51385 1727204602.94392: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
51385 1727204602.94421: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
51385 1727204602.94506: variable '__network_is_ostree' from source: set_fact
51385 1727204602.94512: Evaluated conditional (not __network_is_ostree is defined): False
51385 1727204602.94521: when evaluation is False, skipping this task
51385 1727204602.94524: _execute() done
51385 1727204602.94528: dumping result to json
51385 1727204602.94531: done dumping result, returning
51385 1727204602.94539: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcd87-79f5-6b1f-5706-000000000885]
51385 1727204602.94546: sending task result for task 0affcd87-79f5-6b1f-5706-000000000885
51385 1727204602.94642: done sending task result for task 0affcd87-79f5-6b1f-5706-000000000885
51385 1727204602.94645: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __network_is_ostree is defined",
    "skip_reason": "Conditional result was False"
}
51385 1727204602.94699: no more pending results, returning what we have
51385 1727204602.94703: results queue empty
51385 1727204602.94704: checking for any_errors_fatal
51385 1727204602.94710: done checking for any_errors_fatal
51385 1727204602.94711: checking for max_fail_percentage
51385 1727204602.94713: done checking for max_fail_percentage
51385 1727204602.94714: checking to see if all hosts have failed and the running result is not ok
51385 1727204602.94716: done checking to see if all hosts have failed
51385 1727204602.94716: getting the remaining hosts for this loop
51385 1727204602.94719: done getting the remaining hosts for this loop
51385 1727204602.94723: getting the next task for host managed-node1
51385 1727204602.94730: done getting next task for host managed-node1
51385 1727204602.94734: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree
51385 1727204602.94739: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
51385 1727204602.94759: getting variables
51385 1727204602.94761: in VariableManager get_vars()
51385 1727204602.94804: Calling all_inventory to load vars for managed-node1
51385 1727204602.94807: Calling groups_inventory to load vars for managed-node1
51385 1727204602.94810: Calling all_plugins_inventory to load vars for managed-node1
51385 1727204602.94820: Calling all_plugins_play to load vars for managed-node1
51385 1727204602.94824: Calling groups_plugins_inventory to load vars for managed-node1
51385 1727204602.94827: Calling groups_plugins_play to load vars for managed-node1
51385 1727204602.96455: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
51385 1727204602.98093: done with get_vars()
51385 1727204602.98128: done getting variables
51385 1727204602.98195: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] ***
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17
Tuesday 24 September 2024 15:03:22 -0400 (0:00:00.058) 0:00:21.386 *****
51385 1727204602.98234: entering _queue_task() for managed-node1/set_fact
51385 1727204602.98568: worker is 1 (out of 1 available)
51385 1727204602.98581: exiting _queue_task() for managed-node1/set_fact
51385 1727204602.98592: done queuing things up, now waiting for results queue to drain
51385 1727204602.98594: waiting for pending results...
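The log above shows the fact-gathering task being skipped because `__network_required_facts | difference(ansible_facts.keys() | list) | length > 0` evaluated to False. A small Python sketch of that logic (the fact names below are illustrative, not the role's actual required list):

```python
# Sketch of the role's fact gate: re-run setup only if some required fact
# is missing from what has already been gathered. Mirrors the Jinja2
# `difference` filter seen in the log; this is not Ansible's evaluator.
def missing_facts(required: list, gathered: dict) -> list:
    # `difference` keeps items of `required` that are absent from the
    # gathered fact keys
    return [name for name in required if name not in gathered]

required = ["distribution", "distribution_major_version", "os_family"]  # assumed subset
gathered = {"distribution": "CentOS", "distribution_major_version": "9", "os_family": "RedHat"}
print(len(missing_facts(required, gathered)) > 0)  # False -> task skipped, as in the log
```

The same pattern explains the ostree checks: `not __network_is_ostree is defined` is False once an earlier `set_fact` has defined `__network_is_ostree`, so both ostree tasks skip.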
51385 1727204602.98878: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree
51385 1727204602.99051: in run() - task 0affcd87-79f5-6b1f-5706-000000000886
51385 1727204602.99073: variable 'ansible_search_path' from source: unknown
51385 1727204602.99080: variable 'ansible_search_path' from source: unknown
51385 1727204602.99118: calling self._execute()
51385 1727204602.99213: variable 'ansible_host' from source: host vars for 'managed-node1'
51385 1727204602.99224: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
51385 1727204602.99237: variable 'omit' from source: magic vars
51385 1727204602.99611: variable 'ansible_distribution_major_version' from source: facts
51385 1727204602.99629: Evaluated conditional (ansible_distribution_major_version != '6'): True
51385 1727204602.99802: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
51385 1727204603.00075: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
51385 1727204603.00127: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
51385 1727204603.00166: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
51385 1727204603.00204: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
51385 1727204603.00297: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
51385 1727204603.00322: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
51385 1727204603.00352: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
51385 1727204603.00380: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
51385 1727204603.00466: variable '__network_is_ostree' from source: set_fact
51385 1727204603.00479: Evaluated conditional (not __network_is_ostree is defined): False
51385 1727204603.00485: when evaluation is False, skipping this task
51385 1727204603.00491: _execute() done
51385 1727204603.00496: dumping result to json
51385 1727204603.00501: done dumping result, returning
51385 1727204603.00510: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcd87-79f5-6b1f-5706-000000000886]
51385 1727204603.00519: sending task result for task 0affcd87-79f5-6b1f-5706-000000000886
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __network_is_ostree is defined",
    "skip_reason": "Conditional result was False"
}
51385 1727204603.00677: no more pending results, returning what we have
51385 1727204603.00682: results queue empty
51385 1727204603.00684: checking for any_errors_fatal
51385 1727204603.00689: done checking for any_errors_fatal
51385 1727204603.00690: checking for max_fail_percentage
51385 1727204603.00692: done checking for max_fail_percentage
51385 1727204603.00693: checking to see if all hosts have failed and the running result is not ok
51385 1727204603.00694: done checking to see if all hosts have failed
51385 1727204603.00695: getting the remaining hosts for this loop
51385 1727204603.00697: done getting the remaining hosts for this loop
51385 1727204603.00702: getting the next task for host managed-node1
51385 1727204603.00712: done getting next task for host managed-node1
51385 1727204603.00716: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running
51385 1727204603.00721: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
51385 1727204603.00744: getting variables
51385 1727204603.00746: in VariableManager get_vars()
51385 1727204603.00795: Calling all_inventory to load vars for managed-node1
51385 1727204603.00798: Calling groups_inventory to load vars for managed-node1
51385 1727204603.00801: Calling all_plugins_inventory to load vars for managed-node1
51385 1727204603.00812: Calling all_plugins_play to load vars for managed-node1
51385 1727204603.00816: Calling groups_plugins_inventory to load vars for managed-node1
51385 1727204603.00819: Calling groups_plugins_play to load vars for managed-node1
51385 1727204603.01884: done sending task result for task 0affcd87-79f5-6b1f-5706-000000000886
51385 1727204603.01888: WORKER PROCESS EXITING
51385 1727204603.07083: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
51385 1727204603.08670: done with get_vars()
51385 1727204603.08699: done getting variables

TASK [fedora.linux_system_roles.network : Check which services are running] ****
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Tuesday 24 September 2024 15:03:23 -0400 (0:00:00.105) 0:00:21.491 *****
51385 1727204603.08792: entering _queue_task() for managed-node1/service_facts
51385 1727204603.09131: worker is 1 (out of 1 available)
51385 1727204603.09145: exiting _queue_task() for managed-node1/service_facts
51385 1727204603.09158: done queuing things up, now waiting for results queue to drain
51385 1727204603.09159: waiting for pending results...
51385 1727204603.09455: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which services are running
51385 1727204603.09639: in run() - task 0affcd87-79f5-6b1f-5706-000000000888
51385 1727204603.09660: variable 'ansible_search_path' from source: unknown
51385 1727204603.09671: variable 'ansible_search_path' from source: unknown
51385 1727204603.09716: calling self._execute()
51385 1727204603.09818: variable 'ansible_host' from source: host vars for 'managed-node1'
51385 1727204603.09837: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
51385 1727204603.09852: variable 'omit' from source: magic vars
51385 1727204603.10251: variable 'ansible_distribution_major_version' from source: facts
51385 1727204603.10275: Evaluated conditional (ansible_distribution_major_version != '6'): True
51385 1727204603.10286: variable 'omit' from source: magic vars
51385 1727204603.10366: variable 'omit' from source: magic vars
51385 1727204603.10412: variable 'omit' from source: magic vars
51385 1727204603.10459: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
51385 1727204603.10507: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
51385 1727204603.10535: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
51385 1727204603.10559: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
51385 1727204603.10580: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
51385 1727204603.10620: variable 'inventory_hostname' from source: host vars for 'managed-node1'
51385 1727204603.10626: variable 'ansible_host' from source: host vars for 'managed-node1'
51385 1727204603.10632: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
51385 1727204603.10729: Set connection var ansible_pipelining to False
51385 1727204603.10737: Set connection var ansible_shell_type to sh
51385 1727204603.10751: Set connection var ansible_module_compression to ZIP_DEFLATED
51385 1727204603.10765: Set connection var ansible_timeout to 10
51385 1727204603.10772: Set connection var ansible_connection to ssh
51385 1727204603.10780: Set connection var ansible_shell_executable to /bin/sh
51385 1727204603.10808: variable 'ansible_shell_executable' from source: unknown
51385 1727204603.10816: variable 'ansible_connection' from source: unknown
51385 1727204603.10823: variable 'ansible_module_compression' from source: unknown
51385 1727204603.10829: variable 'ansible_shell_type' from source: unknown
51385 1727204603.10835: variable 'ansible_shell_executable' from source: unknown
51385 1727204603.10842: variable 'ansible_host' from source: host vars for 'managed-node1'
51385 1727204603.10849: variable 'ansible_pipelining' from source: unknown
51385 1727204603.10855: variable 'ansible_timeout' from source: unknown
51385 1727204603.10862: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
51385 1727204603.11060: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__)
51385 1727204603.11079: variable 'omit' from source: magic vars
51385 1727204603.11088: starting attempt loop
51385 1727204603.11094: running the handler
51385 1727204603.11111: _low_level_execute_command(): starting
51385 1727204603.11120: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
51385 1727204603.11887: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
51385 1727204603.11907: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
51385 1727204603.11922: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
51385 1727204603.11939: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
51385 1727204603.11984: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<<
51385 1727204603.11997: stderr chunk (state=3): >>>debug2: match not found <<<
51385 1727204603.12013: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
51385 1727204603.12031: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
51385 1727204603.12045: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<<
51385 1727204603.12056: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
51385 1727204603.12071: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
51385 1727204603.12085: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
51385 1727204603.12100: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
51385 1727204603.12115: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<<
51385 1727204603.12127: stderr chunk (state=3): >>>debug2: match found <<<
51385 1727204603.12139: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
51385 1727204603.12219: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
51385 1727204603.12244: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
51385 1727204603.12259: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
51385 1727204603.12352: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
51385 1727204603.13950: stdout chunk (state=3): >>>/root <<<
51385 1727204603.14083: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
51385 1727204603.14163: stderr chunk (state=3): >>><<<
51385 1727204603.14168: stdout chunk (state=3): >>><<<
51385 1727204603.14274: _low_level_execute_command() done: rc=0, stdout=/root
, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.9.148 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
51385 1727204603.14278: _low_level_execute_command(): starting
51385 1727204603.14281: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204603.1419275-52986-74898302484097 `" && echo ansible-tmp-1727204603.1419275-52986-74898302484097="` echo /root/.ansible/tmp/ansible-tmp-1727204603.1419275-52986-74898302484097 `" ) && sleep 0'
51385 1727204603.14899: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
51385 1727204603.14914: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
51385 1727204603.14933: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
51385 1727204603.14953: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
51385 1727204603.14997: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<<
51385 1727204603.15009: stderr chunk (state=3): >>>debug2: match not found <<<
51385 1727204603.15023: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
51385 1727204603.15045: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
51385 1727204603.15058: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<<
51385 1727204603.15072: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
51385 1727204603.15086: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
51385 1727204603.15101: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
51385 1727204603.15117: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
51385 1727204603.15129: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<<
51385 1727204603.15145: stderr chunk (state=3): >>>debug2: match found <<<
51385 1727204603.15160: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
51385 1727204603.15238: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
51385 1727204603.15271: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
51385 1727204603.15290: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
51385 1727204603.15384: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
51385 1727204603.17222: stdout chunk (state=3): >>>ansible-tmp-1727204603.1419275-52986-74898302484097=/root/.ansible/tmp/ansible-tmp-1727204603.1419275-52986-74898302484097 <<<
51385 1727204603.17337: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
51385 1727204603.17399: stderr chunk (state=3): >>><<<
51385 1727204603.17401: stdout chunk (state=3): >>><<<
51385 1727204603.17430: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204603.1419275-52986-74898302484097=/root/.ansible/tmp/ansible-tmp-1727204603.1419275-52986-74898302484097
, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204603.17456: variable 'ansible_module_compression' from source: unknown 51385 1727204603.17495: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-51385tpxlmlox/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 51385 1727204603.17527: variable 'ansible_facts' from source: unknown 51385 1727204603.17588: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204603.1419275-52986-74898302484097/AnsiballZ_service_facts.py 51385 1727204603.17708: Sending initial data 51385 1727204603.17711: Sent initial data (161 bytes) 51385 1727204603.18725: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204603.18730: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204603.18788: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204603.18791: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204603.18805: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204603.18892: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204603.18912: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204603.18995: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204603.20686: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 51385 1727204603.20737: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 51385 1727204603.20789: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-51385tpxlmlox/tmp5pdo0ua5 /root/.ansible/tmp/ansible-tmp-1727204603.1419275-52986-74898302484097/AnsiballZ_service_facts.py <<< 51385 1727204603.20842: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 51385 1727204603.21992: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204603.22081: stderr chunk (state=3): >>><<< 51385 1727204603.22085: stdout chunk (state=3): >>><<< 51385 1727204603.22106: 
done transferring module to remote 51385 1727204603.22117: _low_level_execute_command(): starting 51385 1727204603.22123: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204603.1419275-52986-74898302484097/ /root/.ansible/tmp/ansible-tmp-1727204603.1419275-52986-74898302484097/AnsiballZ_service_facts.py && sleep 0' 51385 1727204603.22867: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204603.22897: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204603.22905: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204603.22913: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204603.22922: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204603.22928: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204603.23005: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204603.23008: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204603.23076: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204603.25557: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 <<< 51385 1727204603.25571: stderr chunk (state=3): >>><<< 51385 1727204603.25575: stdout chunk (state=3): >>><<< 51385 1727204603.25581: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204603.25584: _low_level_execute_command(): starting 51385 1727204603.25592: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204603.1419275-52986-74898302484097/AnsiballZ_service_facts.py && sleep 0' 51385 1727204603.25649: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204603.25653: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204603.25721: 
stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204603.25728: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 51385 1727204603.25730: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204603.25732: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204603.25735: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204603.25802: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204603.25806: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204603.25822: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204603.25893: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204604.53690: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": 
"systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": 
"stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-qu<<< 51385 1727204604.53760: stdout chunk (state=3): >>>it-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", 
"state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", 
"source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": 
"disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", 
"state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", 
"state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": 
"sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 51385 1727204604.55083: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
<<< 51385 1727204604.55087: stdout chunk (state=3): >>><<< 51385 1727204604.55090: stderr chunk (state=3): >>><<< 51385 1727204604.55774: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": 
"systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": 
"nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": 
{"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": 
"static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": 
"static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": 
{"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 51385 1727204604.55802: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204603.1419275-52986-74898302484097/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 51385 1727204604.55817: _low_level_execute_command(): starting 51385 1727204604.55826: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204603.1419275-52986-74898302484097/ > /dev/null 2>&1 && sleep 0' 51385 1727204604.56491: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204604.56505: stderr chunk (state=3): >>>debug1: Reading 
configuration data /root/.ssh/config <<< 51385 1727204604.56519: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204604.56537: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204604.56586: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204604.56600: stderr chunk (state=3): >>>debug2: match not found <<< 51385 1727204604.56614: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204604.56630: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 51385 1727204604.56642: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 51385 1727204604.56651: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 51385 1727204604.56669: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204604.56683: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204604.56697: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204604.56708: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204604.56718: stderr chunk (state=3): >>>debug2: match found <<< 51385 1727204604.56730: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204604.56811: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204604.56833: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204604.56849: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204604.56939: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 51385 1727204604.58800: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204604.58803: stdout chunk (state=3): >>><<< 51385 1727204604.58806: stderr chunk (state=3): >>><<< 51385 1727204604.59073: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204604.59077: handler run complete 51385 1727204604.59079: variable 'ansible_facts' from source: unknown 51385 1727204604.59185: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204604.59852: variable 'ansible_facts' from source: unknown 51385 1727204604.60133: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204604.60347: attempt loop complete, returning result 51385 
1727204604.60356: _execute() done 51385 1727204604.60370: dumping result to json 51385 1727204604.60433: done dumping result, returning 51385 1727204604.60447: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which services are running [0affcd87-79f5-6b1f-5706-000000000888] 51385 1727204604.60457: sending task result for task 0affcd87-79f5-6b1f-5706-000000000888 ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 51385 1727204604.61883: no more pending results, returning what we have 51385 1727204604.61886: results queue empty 51385 1727204604.61887: checking for any_errors_fatal 51385 1727204604.61895: done checking for any_errors_fatal 51385 1727204604.61896: checking for max_fail_percentage 51385 1727204604.61898: done checking for max_fail_percentage 51385 1727204604.61899: checking to see if all hosts have failed and the running result is not ok 51385 1727204604.61900: done checking to see if all hosts have failed 51385 1727204604.61901: getting the remaining hosts for this loop 51385 1727204604.61902: done getting the remaining hosts for this loop 51385 1727204604.61906: getting the next task for host managed-node1 51385 1727204604.61913: done getting next task for host managed-node1 51385 1727204604.61917: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 51385 1727204604.61921: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 51385 1727204604.61934: getting variables 51385 1727204604.61936: in VariableManager get_vars() 51385 1727204604.61979: Calling all_inventory to load vars for managed-node1 51385 1727204604.61982: Calling groups_inventory to load vars for managed-node1 51385 1727204604.61984: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204604.61995: Calling all_plugins_play to load vars for managed-node1 51385 1727204604.61998: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204604.62000: Calling groups_plugins_play to load vars for managed-node1 51385 1727204604.63474: done sending task result for task 0affcd87-79f5-6b1f-5706-000000000888 51385 1727204604.63484: WORKER PROCESS EXITING 51385 1727204604.65191: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204604.67301: done with get_vars() 51385 1727204604.67334: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 15:03:24 -0400 (0:00:01.594) 0:00:23.086 ***** 51385 1727204604.68216: entering _queue_task() for managed-node1/package_facts 51385 1727204604.68918: worker is 1 (out of 1 available) 51385 1727204604.68932: exiting _queue_task() for managed-node1/package_facts 51385 1727204604.68945: done queuing things up, now waiting for 
results queue to drain 51385 1727204604.68946: waiting for pending results... 51385 1727204604.70368: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 51385 1727204604.70528: in run() - task 0affcd87-79f5-6b1f-5706-000000000889 51385 1727204604.70542: variable 'ansible_search_path' from source: unknown 51385 1727204604.70546: variable 'ansible_search_path' from source: unknown 51385 1727204604.70590: calling self._execute() 51385 1727204604.70696: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204604.70703: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204604.70713: variable 'omit' from source: magic vars 51385 1727204604.71811: variable 'ansible_distribution_major_version' from source: facts 51385 1727204604.71823: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204604.71831: variable 'omit' from source: magic vars 51385 1727204604.71905: variable 'omit' from source: magic vars 51385 1727204604.71940: variable 'omit' from source: magic vars 51385 1727204604.71983: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 51385 1727204604.72020: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 51385 1727204604.72042: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 51385 1727204604.72062: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 51385 1727204604.72072: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 51385 1727204604.72103: variable 'inventory_hostname' from source: host vars for 'managed-node1' 51385 1727204604.72107: variable 'ansible_host' from source: host vars for 
'managed-node1' 51385 1727204604.72109: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204604.72203: Set connection var ansible_pipelining to False 51385 1727204604.72207: Set connection var ansible_shell_type to sh 51385 1727204604.72215: Set connection var ansible_module_compression to ZIP_DEFLATED 51385 1727204604.72223: Set connection var ansible_timeout to 10 51385 1727204604.72226: Set connection var ansible_connection to ssh 51385 1727204604.72231: Set connection var ansible_shell_executable to /bin/sh 51385 1727204604.72256: variable 'ansible_shell_executable' from source: unknown 51385 1727204604.72261: variable 'ansible_connection' from source: unknown 51385 1727204604.72266: variable 'ansible_module_compression' from source: unknown 51385 1727204604.72269: variable 'ansible_shell_type' from source: unknown 51385 1727204604.72272: variable 'ansible_shell_executable' from source: unknown 51385 1727204604.72274: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204604.72276: variable 'ansible_pipelining' from source: unknown 51385 1727204604.72279: variable 'ansible_timeout' from source: unknown 51385 1727204604.72281: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204604.73192: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 51385 1727204604.73205: variable 'omit' from source: magic vars 51385 1727204604.73210: starting attempt loop 51385 1727204604.73213: running the handler 51385 1727204604.73227: _low_level_execute_command(): starting 51385 1727204604.73235: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 51385 1727204604.75076: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204604.75087: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204604.75235: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204604.75240: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204604.75258: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found <<< 51385 1727204604.75266: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204604.75345: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204604.75423: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204604.75427: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204604.75511: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204604.77055: stdout chunk (state=3): >>>/root <<< 51385 1727204604.77232: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204604.77268: stderr chunk (state=3): >>><<< 51385 1727204604.77272: stdout chunk (state=3): >>><<< 51385 1727204604.77375: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204604.77379: _low_level_execute_command(): starting 51385 1727204604.77382: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204604.772968-53025-192151970265911 `" && echo ansible-tmp-1727204604.772968-53025-192151970265911="` echo /root/.ansible/tmp/ansible-tmp-1727204604.772968-53025-192151970265911 `" ) && sleep 0' 51385 1727204604.78062: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204604.78083: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204604.78100: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204604.78118: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204604.78180: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.9.148 originally 10.31.9.148 <<< 51385 1727204604.78193: stderr chunk (state=3): >>>debug2: match not found <<< 51385 1727204604.78208: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204604.78227: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 51385 1727204604.78245: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 51385 1727204604.78258: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 51385 1727204604.78277: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204604.78292: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204604.78309: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204604.78320: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204604.78330: stderr chunk (state=3): >>>debug2: match found <<< 51385 1727204604.78340: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204604.78415: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204604.78436: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204604.78452: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204604.78554: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204604.80398: stdout chunk (state=3): >>>ansible-tmp-1727204604.772968-53025-192151970265911=/root/.ansible/tmp/ansible-tmp-1727204604.772968-53025-192151970265911 <<< 51385 1727204604.80623: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204604.80627: stdout chunk (state=3): >>><<< 51385 
1727204604.80630: stderr chunk (state=3): >>><<< 51385 1727204604.80872: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204604.772968-53025-192151970265911=/root/.ansible/tmp/ansible-tmp-1727204604.772968-53025-192151970265911 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204604.80877: variable 'ansible_module_compression' from source: unknown 51385 1727204604.80879: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-51385tpxlmlox/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 51385 1727204604.80882: variable 'ansible_facts' from source: unknown 51385 1727204604.81038: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204604.772968-53025-192151970265911/AnsiballZ_package_facts.py 51385 1727204604.81651: Sending initial data 51385 1727204604.81655: Sent initial data (161 bytes) 51385 1727204604.82592: stderr chunk (state=3): 
>>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204604.82597: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204604.82635: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204604.82638: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204604.82640: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204604.82827: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204604.82916: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204604.83091: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204604.84801: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension 
"limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 51385 1727204604.84852: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 51385 1727204604.84906: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-51385tpxlmlox/tmpr73_ffj_ /root/.ansible/tmp/ansible-tmp-1727204604.772968-53025-192151970265911/AnsiballZ_package_facts.py <<< 51385 1727204604.84964: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 51385 1727204604.88130: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204604.88220: stderr chunk (state=3): >>><<< 51385 1727204604.88224: stdout chunk (state=3): >>><<< 51385 1727204604.88250: done transferring module to remote 51385 1727204604.88265: _low_level_execute_command(): starting 51385 1727204604.88275: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204604.772968-53025-192151970265911/ /root/.ansible/tmp/ansible-tmp-1727204604.772968-53025-192151970265911/AnsiballZ_package_facts.py && sleep 0' 51385 1727204604.88956: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204604.88969: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204604.88980: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204604.88993: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204604.89039: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204604.89046: stderr chunk (state=3): >>>debug2: match not found <<< 51385 1727204604.89056: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204604.89076: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 51385 1727204604.89084: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 51385 1727204604.89090: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 51385 1727204604.89098: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204604.89109: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204604.89124: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204604.89131: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204604.89139: stderr chunk (state=3): >>>debug2: match found <<< 51385 1727204604.89151: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204604.89229: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204604.89250: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204604.89268: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204604.89348: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204604.91109: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204604.91113: stderr chunk (state=3): >>><<< 51385 1727204604.91121: stdout chunk (state=3): >>><<< 51385 1727204604.91137: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204604.91140: _low_level_execute_command(): starting 51385 1727204604.91145: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204604.772968-53025-192151970265911/AnsiballZ_package_facts.py && sleep 0' 51385 1727204604.92529: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204604.92539: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204604.92546: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204604.92563: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204604.92604: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204604.92613: stderr chunk (state=3): >>>debug2: match not found <<< 51385 1727204604.92630: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204604.92641: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass <<< 51385 1727204604.92649: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 51385 1727204604.92652: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 51385 1727204604.92660: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204604.92674: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204604.92689: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204604.92692: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204604.92699: stderr chunk (state=3): >>>debug2: match found <<< 51385 1727204604.92708: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204604.92784: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204604.92802: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204604.92813: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204604.92908: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204605.38748: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": 
[{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": 
"firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", 
"release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{<<< 51385 1727204605.38824: stdout chunk (state=3): >>>"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": <<< 51385 1727204605.38829: stdout chunk (state=3): >>>"53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", 
"version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4"<<< 51385 1727204605.38835: stdout chunk (state=3): >>>, "release": 
"10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", 
"release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": 
[{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x<<< 51385 1727204605.38840: stdout chunk (state=3): >>>86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "47.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", 
"release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, 
"arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "rel<<< 51385 1727204605.38846: stdout chunk (state=3): >>>ease": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "<<< 51385 1727204605.38850: stdout chunk (state=3): >>>libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", 
"version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": 
[{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], <<< 51385 1727204605.38859: stdout chunk (state=3): >>>"slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", 
"version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-p<<< 51385 1727204605.38903: stdout chunk (state=3): >>>ubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", 
"release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], 
"perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{<<< 51385 1727204605.38908: stdout chunk (state=3): >>>"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", 
"epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": 
"perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", 
"version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "releas<<< 51385 1727204605.38914: stdout chunk (state=3): >>>e": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": 
"perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source":<<< 51385 1727204605.38944: stdout chunk (state=3): >>> "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": 
"quota-nls", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": 
"13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "ep<<< 51385 1727204605.38958: stdout chunk (state=3): >>>och": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": 
"python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 51385 1727204605.40407: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection 
to 10.31.9.148 closed. <<< 51385 1727204605.40469: stderr chunk (state=3): >>><<< 51385 1727204605.40472: stdout chunk (state=3): >>><<< 51385 1727204605.40513: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", 
"release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "125.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": 
"11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": 
"x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", 
"release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": 
[{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", 
"epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", 
"version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "47.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": 
"26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": 
"34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", 
"release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": 
"authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": 
"3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": 
"perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], 
"perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", 
"release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], 
"perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": 
"dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": 
"1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": 
"3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", 
"release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
51385 1727204605.41946: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204604.772968-53025-192151970265911/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 51385 1727204605.41965: _low_level_execute_command(): starting 51385 1727204605.41969: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204604.772968-53025-192151970265911/ > /dev/null 2>&1 && sleep 0' 51385 1727204605.42449: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204605.42462: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204605.42482: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 51385 1727204605.42495: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 
originally 10.31.9.148 debug2: match found <<< 51385 1727204605.42505: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204605.42552: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204605.42567: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204605.42631: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204605.44431: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204605.44492: stderr chunk (state=3): >>><<< 51385 1727204605.44495: stdout chunk (state=3): >>><<< 51385 1727204605.44508: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204605.44515: handler run complete 
51385 1727204605.45110: variable 'ansible_facts' from source: unknown 51385 1727204605.45514: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204605.47706: variable 'ansible_facts' from source: unknown 51385 1727204605.48196: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204605.48690: attempt loop complete, returning result 51385 1727204605.48700: _execute() done 51385 1727204605.48703: dumping result to json 51385 1727204605.48831: done dumping result, returning 51385 1727204605.48840: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcd87-79f5-6b1f-5706-000000000889] 51385 1727204605.48846: sending task result for task 0affcd87-79f5-6b1f-5706-000000000889 51385 1727204605.50185: done sending task result for task 0affcd87-79f5-6b1f-5706-000000000889 51385 1727204605.50189: WORKER PROCESS EXITING ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 51385 1727204605.50278: no more pending results, returning what we have 51385 1727204605.50280: results queue empty 51385 1727204605.50281: checking for any_errors_fatal 51385 1727204605.50285: done checking for any_errors_fatal 51385 1727204605.50285: checking for max_fail_percentage 51385 1727204605.50286: done checking for max_fail_percentage 51385 1727204605.50287: checking to see if all hosts have failed and the running result is not ok 51385 1727204605.50288: done checking to see if all hosts have failed 51385 1727204605.50288: getting the remaining hosts for this loop 51385 1727204605.50289: done getting the remaining hosts for this loop 51385 1727204605.50294: getting the next task for host managed-node1 51385 1727204605.50300: done getting next task for host managed-node1 51385 1727204605.50303: ^ 
task is: TASK: fedora.linux_system_roles.network : Print network provider 51385 1727204605.50305: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 51385 1727204605.50312: getting variables 51385 1727204605.50313: in VariableManager get_vars() 51385 1727204605.50338: Calling all_inventory to load vars for managed-node1 51385 1727204605.50340: Calling groups_inventory to load vars for managed-node1 51385 1727204605.50342: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204605.50348: Calling all_plugins_play to load vars for managed-node1 51385 1727204605.50350: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204605.50351: Calling groups_plugins_play to load vars for managed-node1 51385 1727204605.51117: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204605.52070: done with get_vars() 51385 1727204605.52088: done getting variables 51385 1727204605.52133: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: 
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 15:03:25 -0400 (0:00:00.839) 0:00:23.925 ***** 51385 1727204605.52162: entering _queue_task() for managed-node1/debug 51385 1727204605.52402: worker is 1 (out of 1 available) 51385 1727204605.52415: exiting _queue_task() for managed-node1/debug 51385 1727204605.52426: done queuing things up, now waiting for results queue to drain 51385 1727204605.52428: waiting for pending results... 51385 1727204605.52781: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Print network provider 51385 1727204605.52949: in run() - task 0affcd87-79f5-6b1f-5706-000000000066 51385 1727204605.52966: variable 'ansible_search_path' from source: unknown 51385 1727204605.52974: variable 'ansible_search_path' from source: unknown 51385 1727204605.53005: calling self._execute() 51385 1727204605.53085: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204605.53088: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204605.53098: variable 'omit' from source: magic vars 51385 1727204605.53392: variable 'ansible_distribution_major_version' from source: facts 51385 1727204605.53402: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204605.53408: variable 'omit' from source: magic vars 51385 1727204605.53444: variable 'omit' from source: magic vars 51385 1727204605.53520: variable 'network_provider' from source: set_fact 51385 1727204605.53533: variable 'omit' from source: magic vars 51385 1727204605.53572: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 51385 1727204605.53601: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 51385 1727204605.53618: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 51385 
1727204605.53631: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 51385 1727204605.53641: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 51385 1727204605.53671: variable 'inventory_hostname' from source: host vars for 'managed-node1' 51385 1727204605.53674: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204605.53677: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204605.53743: Set connection var ansible_pipelining to False 51385 1727204605.53747: Set connection var ansible_shell_type to sh 51385 1727204605.53754: Set connection var ansible_module_compression to ZIP_DEFLATED 51385 1727204605.53761: Set connection var ansible_timeout to 10 51385 1727204605.53767: Set connection var ansible_connection to ssh 51385 1727204605.53773: Set connection var ansible_shell_executable to /bin/sh 51385 1727204605.53793: variable 'ansible_shell_executable' from source: unknown 51385 1727204605.53796: variable 'ansible_connection' from source: unknown 51385 1727204605.53799: variable 'ansible_module_compression' from source: unknown 51385 1727204605.53801: variable 'ansible_shell_type' from source: unknown 51385 1727204605.53804: variable 'ansible_shell_executable' from source: unknown 51385 1727204605.53806: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204605.53810: variable 'ansible_pipelining' from source: unknown 51385 1727204605.53812: variable 'ansible_timeout' from source: unknown 51385 1727204605.53815: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204605.53917: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 51385 1727204605.53930: variable 'omit' from source: magic vars 51385 1727204605.53933: starting attempt loop 51385 1727204605.53936: running the handler 51385 1727204605.53976: handler run complete 51385 1727204605.53987: attempt loop complete, returning result 51385 1727204605.53990: _execute() done 51385 1727204605.53993: dumping result to json 51385 1727204605.53996: done dumping result, returning 51385 1727204605.54001: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Print network provider [0affcd87-79f5-6b1f-5706-000000000066] 51385 1727204605.54008: sending task result for task 0affcd87-79f5-6b1f-5706-000000000066 51385 1727204605.54093: done sending task result for task 0affcd87-79f5-6b1f-5706-000000000066 51385 1727204605.54096: WORKER PROCESS EXITING ok: [managed-node1] => {} MSG: Using network provider: nm 51385 1727204605.54157: no more pending results, returning what we have 51385 1727204605.54160: results queue empty 51385 1727204605.54162: checking for any_errors_fatal 51385 1727204605.54173: done checking for any_errors_fatal 51385 1727204605.54173: checking for max_fail_percentage 51385 1727204605.54175: done checking for max_fail_percentage 51385 1727204605.54176: checking to see if all hosts have failed and the running result is not ok 51385 1727204605.54177: done checking to see if all hosts have failed 51385 1727204605.54178: getting the remaining hosts for this loop 51385 1727204605.54179: done getting the remaining hosts for this loop 51385 1727204605.54183: getting the next task for host managed-node1 51385 1727204605.54188: done getting next task for host managed-node1 51385 1727204605.54191: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` 
variable with the initscripts provider 51385 1727204605.54193: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 51385 1727204605.54209: getting variables 51385 1727204605.54211: in VariableManager get_vars() 51385 1727204605.54251: Calling all_inventory to load vars for managed-node1 51385 1727204605.54253: Calling groups_inventory to load vars for managed-node1 51385 1727204605.54256: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204605.54266: Calling all_plugins_play to load vars for managed-node1 51385 1727204605.54269: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204605.54271: Calling groups_plugins_play to load vars for managed-node1 51385 1727204605.55762: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204605.57669: done with get_vars() 51385 1727204605.57695: done getting variables 51385 1727204605.57756: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: 
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 15:03:25 -0400 (0:00:00.056) 0:00:23.981 ***** 51385 1727204605.57796: entering _queue_task() for managed-node1/fail 51385 1727204605.58143: worker is 1 (out of 1 available) 51385 1727204605.58156: exiting _queue_task() for managed-node1/fail 51385 1727204605.58174: done queuing things up, now waiting for results queue to drain 51385 1727204605.58176: waiting for pending results... 51385 1727204605.58483: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 51385 1727204605.58587: in run() - task 0affcd87-79f5-6b1f-5706-000000000067 51385 1727204605.58602: variable 'ansible_search_path' from source: unknown 51385 1727204605.58606: variable 'ansible_search_path' from source: unknown 51385 1727204605.58633: calling self._execute() 51385 1727204605.58711: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204605.58715: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204605.58725: variable 'omit' from source: magic vars 51385 1727204605.59019: variable 'ansible_distribution_major_version' from source: facts 51385 1727204605.59031: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204605.59119: variable 'network_state' from source: role '' defaults 51385 1727204605.59126: Evaluated conditional (network_state != {}): False 51385 1727204605.59132: when evaluation is False, skipping this task 51385 1727204605.59134: _execute() done 51385 1727204605.59137: dumping result to json 51385 1727204605.59139: done dumping result, returning 51385 1727204605.59151: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the 
`network_state` variable with the initscripts provider [0affcd87-79f5-6b1f-5706-000000000067] 51385 1727204605.59162: sending task result for task 0affcd87-79f5-6b1f-5706-000000000067 51385 1727204605.59253: done sending task result for task 0affcd87-79f5-6b1f-5706-000000000067 51385 1727204605.59256: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 51385 1727204605.59310: no more pending results, returning what we have 51385 1727204605.59313: results queue empty 51385 1727204605.59314: checking for any_errors_fatal 51385 1727204605.59322: done checking for any_errors_fatal 51385 1727204605.59323: checking for max_fail_percentage 51385 1727204605.59324: done checking for max_fail_percentage 51385 1727204605.59325: checking to see if all hosts have failed and the running result is not ok 51385 1727204605.59326: done checking to see if all hosts have failed 51385 1727204605.59327: getting the remaining hosts for this loop 51385 1727204605.59328: done getting the remaining hosts for this loop 51385 1727204605.59332: getting the next task for host managed-node1 51385 1727204605.59338: done getting next task for host managed-node1 51385 1727204605.59342: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 51385 1727204605.59345: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 51385 1727204605.59367: getting variables 51385 1727204605.59369: in VariableManager get_vars() 51385 1727204605.59404: Calling all_inventory to load vars for managed-node1 51385 1727204605.59407: Calling groups_inventory to load vars for managed-node1 51385 1727204605.59409: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204605.59418: Calling all_plugins_play to load vars for managed-node1 51385 1727204605.59420: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204605.59422: Calling groups_plugins_play to load vars for managed-node1 51385 1727204605.60362: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204605.62186: done with get_vars() 51385 1727204605.62213: done getting variables 51385 1727204605.62274: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 15:03:25 -0400 (0:00:00.045) 0:00:24.027 ***** 51385 1727204605.62315: entering _queue_task() for managed-node1/fail 51385 1727204605.62642: worker is 1 (out of 1 available) 51385 1727204605.62655: exiting _queue_task() for managed-node1/fail 51385 1727204605.62670: done queuing things up, now waiting for results queue to drain 51385 1727204605.62671: waiting for pending results... 
51385 1727204605.62979: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 51385 1727204605.63103: in run() - task 0affcd87-79f5-6b1f-5706-000000000068 51385 1727204605.63119: variable 'ansible_search_path' from source: unknown 51385 1727204605.63123: variable 'ansible_search_path' from source: unknown 51385 1727204605.63167: calling self._execute() 51385 1727204605.63272: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204605.63282: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204605.63292: variable 'omit' from source: magic vars 51385 1727204605.63686: variable 'ansible_distribution_major_version' from source: facts 51385 1727204605.63699: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204605.63839: variable 'network_state' from source: role '' defaults 51385 1727204605.63848: Evaluated conditional (network_state != {}): False 51385 1727204605.63852: when evaluation is False, skipping this task 51385 1727204605.63855: _execute() done 51385 1727204605.63858: dumping result to json 51385 1727204605.63865: done dumping result, returning 51385 1727204605.63875: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affcd87-79f5-6b1f-5706-000000000068] 51385 1727204605.63882: sending task result for task 0affcd87-79f5-6b1f-5706-000000000068 skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 51385 1727204605.64024: no more pending results, returning what we have 51385 1727204605.64030: results queue empty 51385 1727204605.64031: checking for any_errors_fatal 51385 1727204605.64043: done checking for any_errors_fatal 
51385 1727204605.64044: checking for max_fail_percentage 51385 1727204605.64046: done checking for max_fail_percentage 51385 1727204605.64047: checking to see if all hosts have failed and the running result is not ok 51385 1727204605.64048: done checking to see if all hosts have failed 51385 1727204605.64049: getting the remaining hosts for this loop 51385 1727204605.64051: done getting the remaining hosts for this loop 51385 1727204605.64055: getting the next task for host managed-node1 51385 1727204605.64061: done getting next task for host managed-node1 51385 1727204605.64067: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 51385 1727204605.64071: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51385 1727204605.64097: getting variables 51385 1727204605.64099: in VariableManager get_vars() 51385 1727204605.64141: Calling all_inventory to load vars for managed-node1 51385 1727204605.64144: Calling groups_inventory to load vars for managed-node1 51385 1727204605.64146: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204605.64161: Calling all_plugins_play to load vars for managed-node1 51385 1727204605.64165: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204605.64170: Calling groups_plugins_play to load vars for managed-node1 51385 1727204605.64689: done sending task result for task 0affcd87-79f5-6b1f-5706-000000000068 51385 1727204605.64693: WORKER PROCESS EXITING 51385 1727204605.66141: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204605.67839: done with get_vars() 51385 1727204605.67866: done getting variables 51385 1727204605.67924: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 15:03:25 -0400 (0:00:00.056) 0:00:24.083 ***** 51385 1727204605.67966: entering _queue_task() for managed-node1/fail 51385 1727204605.68307: worker is 1 (out of 1 available) 51385 1727204605.68320: exiting _queue_task() for managed-node1/fail 51385 1727204605.68332: done queuing things up, now waiting for results queue to drain 51385 1727204605.68334: waiting for pending results... 
51385 1727204605.68635: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 51385 1727204605.68782: in run() - task 0affcd87-79f5-6b1f-5706-000000000069 51385 1727204605.68800: variable 'ansible_search_path' from source: unknown 51385 1727204605.68807: variable 'ansible_search_path' from source: unknown 51385 1727204605.68846: calling self._execute() 51385 1727204605.68947: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204605.68958: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204605.68978: variable 'omit' from source: magic vars 51385 1727204605.69370: variable 'ansible_distribution_major_version' from source: facts 51385 1727204605.69388: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204605.69572: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 51385 1727204605.72057: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 51385 1727204605.72139: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 51385 1727204605.72191: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 51385 1727204605.72233: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 51385 1727204605.72274: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 51385 1727204605.72361: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 51385 1727204605.72414: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 51385 1727204605.72448: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 51385 1727204605.72503: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 51385 1727204605.72523: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 51385 1727204605.72634: variable 'ansible_distribution_major_version' from source: facts 51385 1727204605.72656: Evaluated conditional (ansible_distribution_major_version | int > 9): False 51385 1727204605.72669: when evaluation is False, skipping this task 51385 1727204605.72677: _execute() done 51385 1727204605.72684: dumping result to json 51385 1727204605.72693: done dumping result, returning 51385 1727204605.72706: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcd87-79f5-6b1f-5706-000000000069] 51385 1727204605.72718: sending task result for task 0affcd87-79f5-6b1f-5706-000000000069 skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int > 9", "skip_reason": "Conditional result was False" } 51385 1727204605.72891: no more pending results, returning what we have 51385 1727204605.72895: results queue empty 51385 1727204605.72897: checking for any_errors_fatal 51385 1727204605.72904: done checking for any_errors_fatal 51385 
1727204605.72905: checking for max_fail_percentage 51385 1727204605.72907: done checking for max_fail_percentage 51385 1727204605.72908: checking to see if all hosts have failed and the running result is not ok 51385 1727204605.72909: done checking to see if all hosts have failed 51385 1727204605.72910: getting the remaining hosts for this loop 51385 1727204605.72912: done getting the remaining hosts for this loop 51385 1727204605.72917: getting the next task for host managed-node1 51385 1727204605.72924: done getting next task for host managed-node1 51385 1727204605.72929: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 51385 1727204605.72932: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51385 1727204605.72952: getting variables 51385 1727204605.72954: in VariableManager get_vars() 51385 1727204605.73004: Calling all_inventory to load vars for managed-node1 51385 1727204605.73007: Calling groups_inventory to load vars for managed-node1 51385 1727204605.73010: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204605.73022: Calling all_plugins_play to load vars for managed-node1 51385 1727204605.73025: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204605.73028: Calling groups_plugins_play to load vars for managed-node1 51385 1727204605.74483: done sending task result for task 0affcd87-79f5-6b1f-5706-000000000069 51385 1727204605.74487: WORKER PROCESS EXITING 51385 1727204605.75535: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204605.77651: done with get_vars() 51385 1727204605.77682: done getting variables 51385 1727204605.77741: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 15:03:25 -0400 (0:00:00.098) 0:00:24.181 ***** 51385 1727204605.77777: entering _queue_task() for managed-node1/dnf 51385 1727204605.78092: worker is 1 (out of 1 available) 51385 1727204605.78104: exiting _queue_task() for managed-node1/dnf 51385 1727204605.78117: done queuing things up, now waiting for results queue to drain 51385 1727204605.78118: waiting for pending results... 
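The task above is gated on `ansible_distribution_major_version` twice: once as a plain string comparison (`!= '6'`) and once cast through the Jinja2 `int` filter (`| int > 9`). A minimal Python sketch (illustrative only, not Ansible's implementation) of why the cast matters for two-digit major versions:

```python
# Fact values such as ansible_distribution_major_version arrive as strings.
major = "10"  # e.g. an EL10 host

# An (in)equality check against another string works regardless of type:
assert major != "6"

# But lexicographic string ordering would misorder two-digit versions,
# which is why the role applies "| int" before the "> 9" comparison:
assert "10" < "9"        # string compare: "1" sorts before "9"
assert int("10") > 9     # int compare gives the intended EL10-or-later result
```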
51385 1727204605.78969: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 51385 1727204605.79109: in run() - task 0affcd87-79f5-6b1f-5706-00000000006a 51385 1727204605.79128: variable 'ansible_search_path' from source: unknown 51385 1727204605.79135: variable 'ansible_search_path' from source: unknown 51385 1727204605.79183: calling self._execute() 51385 1727204605.79290: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204605.79303: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204605.79318: variable 'omit' from source: magic vars 51385 1727204605.79705: variable 'ansible_distribution_major_version' from source: facts 51385 1727204605.79722: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204605.79937: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 51385 1727204605.84391: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 51385 1727204605.84473: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 51385 1727204605.84516: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 51385 1727204605.84563: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 51385 1727204605.84595: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 51385 1727204605.84723: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 51385 1727204605.84892: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 51385 1727204605.84922: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 51385 1727204605.85078: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 51385 1727204605.85098: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 51385 1727204605.85338: variable 'ansible_distribution' from source: facts 51385 1727204605.85348: variable 'ansible_distribution_major_version' from source: facts 51385 1727204605.85401: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 51385 1727204605.85673: variable '__network_wireless_connections_defined' from source: role '' defaults 51385 1727204605.85974: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 51385 1727204605.86078: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 51385 1727204605.86108: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 51385 1727204605.86203: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 51385 1727204605.86286: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 51385 1727204605.86331: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 51385 1727204605.86406: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 51385 1727204605.86516: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 51385 1727204605.86562: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 51385 1727204605.86619: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 51385 1727204605.86663: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 51385 1727204605.86768: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 51385 
1727204605.86844: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 51385 1727204605.86947: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 51385 1727204605.86973: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 51385 1727204605.87398: variable 'network_connections' from source: task vars 51385 1727204605.87417: variable 'interface' from source: play vars 51385 1727204605.87532: variable 'interface' from source: play vars 51385 1727204605.87594: variable 'vlan_interface' from source: play vars 51385 1727204605.87799: variable 'vlan_interface' from source: play vars 51385 1727204605.88020: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 51385 1727204605.88414: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 51385 1727204605.88679: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 51385 1727204605.88721: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 51385 1727204605.88773: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 51385 1727204605.88900: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 51385 1727204605.88987: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 51385 1727204605.89024: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 51385 1727204605.89097: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 51385 1727204605.89197: variable '__network_team_connections_defined' from source: role '' defaults 51385 1727204605.89477: variable 'network_connections' from source: task vars 51385 1727204605.89488: variable 'interface' from source: play vars 51385 1727204605.89563: variable 'interface' from source: play vars 51385 1727204605.89580: variable 'vlan_interface' from source: play vars 51385 1727204605.89652: variable 'vlan_interface' from source: play vars 51385 1727204605.89689: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 51385 1727204605.89697: when evaluation is False, skipping this task 51385 1727204605.89705: _execute() done 51385 1727204605.89712: dumping result to json 51385 1727204605.89724: done dumping result, returning 51385 1727204605.89736: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcd87-79f5-6b1f-5706-00000000006a] 51385 1727204605.89746: sending task result for task 0affcd87-79f5-6b1f-5706-00000000006a skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 51385 1727204605.89913: no more pending results, returning what we have 51385 1727204605.89918: 
results queue empty 51385 1727204605.89919: checking for any_errors_fatal 51385 1727204605.89927: done checking for any_errors_fatal 51385 1727204605.89927: checking for max_fail_percentage 51385 1727204605.89929: done checking for max_fail_percentage 51385 1727204605.89930: checking to see if all hosts have failed and the running result is not ok 51385 1727204605.89931: done checking to see if all hosts have failed 51385 1727204605.89932: getting the remaining hosts for this loop 51385 1727204605.89934: done getting the remaining hosts for this loop 51385 1727204605.89938: getting the next task for host managed-node1 51385 1727204605.89945: done getting next task for host managed-node1 51385 1727204605.89949: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 51385 1727204605.89953: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51385 1727204605.89977: getting variables 51385 1727204605.89979: in VariableManager get_vars() 51385 1727204605.90024: Calling all_inventory to load vars for managed-node1 51385 1727204605.90027: Calling groups_inventory to load vars for managed-node1 51385 1727204605.90030: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204605.90041: Calling all_plugins_play to load vars for managed-node1 51385 1727204605.90044: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204605.90047: Calling groups_plugins_play to load vars for managed-node1 51385 1727204605.91338: done sending task result for task 0affcd87-79f5-6b1f-5706-00000000006a 51385 1727204605.91342: WORKER PROCESS EXITING 51385 1727204605.92028: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204605.93776: done with get_vars() 51385 1727204605.93804: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 51385 1727204605.93887: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 15:03:25 -0400 (0:00:00.161) 0:00:24.343 ***** 51385 1727204605.93921: entering _queue_task() for managed-node1/yum 51385 1727204605.94250: worker is 1 (out of 1 available) 51385 1727204605.94268: exiting _queue_task() for managed-node1/yum 51385 1727204605.94280: done queuing things up, now 
waiting for results queue to drain 51385 1727204605.94281: waiting for pending results... 51385 1727204605.94571: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 51385 1727204605.94720: in run() - task 0affcd87-79f5-6b1f-5706-00000000006b 51385 1727204605.94741: variable 'ansible_search_path' from source: unknown 51385 1727204605.94748: variable 'ansible_search_path' from source: unknown 51385 1727204605.94793: calling self._execute() 51385 1727204605.94899: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204605.94910: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204605.94923: variable 'omit' from source: magic vars 51385 1727204605.95312: variable 'ansible_distribution_major_version' from source: facts 51385 1727204605.95328: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204605.95517: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 51385 1727204605.98947: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 51385 1727204605.99055: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 51385 1727204605.99187: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 51385 1727204605.99373: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 51385 1727204605.99486: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 51385 1727204605.99625: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 51385 1727204605.99811: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 51385 1727204605.99843: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 51385 1727204605.99956: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 51385 1727204605.99996: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 51385 1727204606.00109: variable 'ansible_distribution_major_version' from source: facts 51385 1727204606.00131: Evaluated conditional (ansible_distribution_major_version | int < 8): False 51385 1727204606.00139: when evaluation is False, skipping this task 51385 1727204606.00146: _execute() done 51385 1727204606.00154: dumping result to json 51385 1727204606.00168: done dumping result, returning 51385 1727204606.00182: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affcd87-79f5-6b1f-5706-00000000006b] 51385 1727204606.00197: sending task result for task 0affcd87-79f5-6b1f-5706-00000000006b skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 51385 1727204606.00365: no more pending results, returning 
what we have 51385 1727204606.00371: results queue empty 51385 1727204606.00372: checking for any_errors_fatal 51385 1727204606.00378: done checking for any_errors_fatal 51385 1727204606.00379: checking for max_fail_percentage 51385 1727204606.00381: done checking for max_fail_percentage 51385 1727204606.00382: checking to see if all hosts have failed and the running result is not ok 51385 1727204606.00383: done checking to see if all hosts have failed 51385 1727204606.00384: getting the remaining hosts for this loop 51385 1727204606.00386: done getting the remaining hosts for this loop 51385 1727204606.00390: getting the next task for host managed-node1 51385 1727204606.00398: done getting next task for host managed-node1 51385 1727204606.00402: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 51385 1727204606.00405: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51385 1727204606.00425: getting variables 51385 1727204606.00427: in VariableManager get_vars() 51385 1727204606.00476: Calling all_inventory to load vars for managed-node1 51385 1727204606.00479: Calling groups_inventory to load vars for managed-node1 51385 1727204606.00482: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204606.00492: Calling all_plugins_play to load vars for managed-node1 51385 1727204606.00495: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204606.00498: Calling groups_plugins_play to load vars for managed-node1 51385 1727204606.02086: done sending task result for task 0affcd87-79f5-6b1f-5706-00000000006b 51385 1727204606.02089: WORKER PROCESS EXITING 51385 1727204606.02318: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204606.04206: done with get_vars() 51385 1727204606.04230: done getting variables 51385 1727204606.04298: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 15:03:26 -0400 (0:00:00.104) 0:00:24.447 ***** 51385 1727204606.04334: entering _queue_task() for managed-node1/fail 51385 1727204606.04681: worker is 1 (out of 1 available) 51385 1727204606.04697: exiting _queue_task() for managed-node1/fail 51385 1727204606.04710: done queuing things up, now waiting for results queue to drain 51385 1727204606.04712: waiting for pending results... 
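As the repeated `skipping: [managed-node1]` results above show, the executor evaluates each `when:` clause in order and, at the first one that is False, short-circuits the task into a skip result carrying `false_condition` and `skip_reason`. A small sketch of that flow (my own illustration, not Ansible's actual `TaskExecutor` code):

```python
# conditions: list of (expression_text, boolean_result) pairs, in "when" order.
# action: callable run only if every condition holds.
def run_task(conditions, action):
    for expr, result in conditions:
        if not result:
            # First False condition wins; mirror the JSON seen in the log.
            return {
                "changed": False,
                "false_condition": expr,
                "skip_reason": "Conditional result was False",
            }
    return action()

result = run_task(
    [("ansible_distribution_major_version != '6'", True),
     ("__network_wireless_connections_defined or "
      "__network_team_connections_defined", False)],
    lambda: {"changed": True},
)
```

With those inputs, `result` reproduces the skip payload logged for the wireless/team tasks above.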
51385 1727204606.05029: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 51385 1727204606.05183: in run() - task 0affcd87-79f5-6b1f-5706-00000000006c 51385 1727204606.05202: variable 'ansible_search_path' from source: unknown 51385 1727204606.05210: variable 'ansible_search_path' from source: unknown 51385 1727204606.05252: calling self._execute() 51385 1727204606.05362: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204606.05382: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204606.05400: variable 'omit' from source: magic vars 51385 1727204606.05772: variable 'ansible_distribution_major_version' from source: facts 51385 1727204606.05789: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204606.05911: variable '__network_wireless_connections_defined' from source: role '' defaults 51385 1727204606.06121: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 51385 1727204606.08606: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 51385 1727204606.08684: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 51385 1727204606.08726: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 51385 1727204606.08772: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 51385 1727204606.08803: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 51385 1727204606.08894: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 51385 1727204606.08941: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 51385 1727204606.08978: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 51385 1727204606.09026: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 51385 1727204606.09047: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 51385 1727204606.09109: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 51385 1727204606.09138: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 51385 1727204606.09173: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 51385 1727204606.09222: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 51385 1727204606.09243: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 51385 1727204606.09293: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 51385 1727204606.09326: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 51385 1727204606.09355: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 51385 1727204606.09402: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 51385 1727204606.09426: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 51385 1727204606.09618: variable 'network_connections' from source: task vars 51385 1727204606.09642: variable 'interface' from source: play vars 51385 1727204606.09724: variable 'interface' from source: play vars 51385 1727204606.09740: variable 'vlan_interface' from source: play vars 51385 1727204606.09811: variable 'vlan_interface' from source: play vars 51385 1727204606.09893: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 51385 1727204606.10074: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 51385 1727204606.10117: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 51385 1727204606.10150: 
Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 51385 1727204606.10192: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 51385 1727204606.10237: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 51385 1727204606.10269: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 51385 1727204606.10304: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 51385 1727204606.10332: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 51385 1727204606.10390: variable '__network_team_connections_defined' from source: role '' defaults 51385 1727204606.10645: variable 'network_connections' from source: task vars 51385 1727204606.10654: variable 'interface' from source: play vars 51385 1727204606.10726: variable 'interface' from source: play vars 51385 1727204606.10737: variable 'vlan_interface' from source: play vars 51385 1727204606.10802: variable 'vlan_interface' from source: play vars 51385 1727204606.10833: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 51385 1727204606.10842: when evaluation is False, skipping this task 51385 1727204606.10850: _execute() done 51385 1727204606.10857: dumping result to json 51385 1727204606.10870: done dumping result, returning 51385 1727204606.10881: done running TaskExecutor() for managed-node1/TASK: 
fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcd87-79f5-6b1f-5706-00000000006c] 51385 1727204606.10898: sending task result for task 0affcd87-79f5-6b1f-5706-00000000006c skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 51385 1727204606.11046: no more pending results, returning what we have 51385 1727204606.11050: results queue empty 51385 1727204606.11051: checking for any_errors_fatal 51385 1727204606.11058: done checking for any_errors_fatal 51385 1727204606.11061: checking for max_fail_percentage 51385 1727204606.11063: done checking for max_fail_percentage 51385 1727204606.11066: checking to see if all hosts have failed and the running result is not ok 51385 1727204606.11067: done checking to see if all hosts have failed 51385 1727204606.11067: getting the remaining hosts for this loop 51385 1727204606.11069: done getting the remaining hosts for this loop 51385 1727204606.11074: getting the next task for host managed-node1 51385 1727204606.11081: done getting next task for host managed-node1 51385 1727204606.11086: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 51385 1727204606.11089: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51385 1727204606.11109: getting variables 51385 1727204606.11111: in VariableManager get_vars() 51385 1727204606.11154: Calling all_inventory to load vars for managed-node1 51385 1727204606.11158: Calling groups_inventory to load vars for managed-node1 51385 1727204606.11163: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204606.11175: Calling all_plugins_play to load vars for managed-node1 51385 1727204606.11178: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204606.11181: Calling groups_plugins_play to load vars for managed-node1 51385 1727204606.12484: done sending task result for task 0affcd87-79f5-6b1f-5706-00000000006c 51385 1727204606.12488: WORKER PROCESS EXITING 51385 1727204606.12979: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204606.14716: done with get_vars() 51385 1727204606.14746: done getting variables 51385 1727204606.14812: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 15:03:26 -0400 (0:00:00.105) 0:00:24.552 ***** 51385 1727204606.14850: entering _queue_task() for managed-node1/package 51385 1727204606.15200: worker is 1 (out of 1 available) 51385 1727204606.15213: exiting _queue_task() for managed-node1/package 51385 1727204606.15225: done queuing things up, now waiting for results queue to drain 51385 1727204606.15226: waiting for pending results... 
51385 1727204606.15530: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install packages 51385 1727204606.15684: in run() - task 0affcd87-79f5-6b1f-5706-00000000006d 51385 1727204606.15701: variable 'ansible_search_path' from source: unknown 51385 1727204606.15708: variable 'ansible_search_path' from source: unknown 51385 1727204606.15751: calling self._execute() 51385 1727204606.15856: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204606.15876: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204606.15896: variable 'omit' from source: magic vars 51385 1727204606.16299: variable 'ansible_distribution_major_version' from source: facts 51385 1727204606.16318: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204606.16528: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 51385 1727204606.16804: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 51385 1727204606.16856: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 51385 1727204606.16904: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 51385 1727204606.16986: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 51385 1727204606.17103: variable 'network_packages' from source: role '' defaults 51385 1727204606.17219: variable '__network_provider_setup' from source: role '' defaults 51385 1727204606.17235: variable '__network_service_name_default_nm' from source: role '' defaults 51385 1727204606.17315: variable '__network_service_name_default_nm' from source: role '' defaults 51385 1727204606.17329: variable '__network_packages_default_nm' from source: role '' defaults 51385 1727204606.17394: variable 
'__network_packages_default_nm' from source: role '' defaults 51385 1727204606.17594: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 51385 1727204606.22452: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 51385 1727204606.22609: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 51385 1727204606.22653: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 51385 1727204606.22699: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 51385 1727204606.22729: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 51385 1727204606.22816: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 51385 1727204606.22854: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 51385 1727204606.22890: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 51385 1727204606.22939: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 51385 1727204606.22957: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 51385 
1727204606.23013: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 51385 1727204606.23044: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 51385 1727204606.23077: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 51385 1727204606.23119: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 51385 1727204606.23141: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 51385 1727204606.23398: variable '__network_packages_default_gobject_packages' from source: role '' defaults 51385 1727204606.23522: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 51385 1727204606.23550: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 51385 1727204606.23587: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 51385 1727204606.23628: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 51385 1727204606.23646: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 51385 1727204606.23750: variable 'ansible_python' from source: facts 51385 1727204606.23790: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 51385 1727204606.23880: variable '__network_wpa_supplicant_required' from source: role '' defaults 51385 1727204606.23956: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 51385 1727204606.24455: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 51385 1727204606.24489: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 51385 1727204606.24518: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 51385 1727204606.24571: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 51385 1727204606.24593: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 51385 1727204606.24650: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 51385 1727204606.24695: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 51385 1727204606.24722: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 51385 1727204606.24774: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 51385 1727204606.24793: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 51385 1727204606.24953: variable 'network_connections' from source: task vars 51385 1727204606.24970: variable 'interface' from source: play vars 51385 1727204606.25091: variable 'interface' from source: play vars 51385 1727204606.25107: variable 'vlan_interface' from source: play vars 51385 1727204606.25224: variable 'vlan_interface' from source: play vars 51385 1727204606.25303: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 51385 1727204606.25338: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 51385 1727204606.25379: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 51385 1727204606.25416: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 51385 1727204606.25479: variable '__network_wireless_connections_defined' from source: role '' defaults 51385 1727204606.25793: variable 'network_connections' from source: task vars 51385 1727204606.25802: variable 'interface' from source: play vars 51385 1727204606.25916: variable 'interface' from source: play vars 51385 1727204606.25931: variable 'vlan_interface' from source: play vars 51385 1727204606.26043: variable 'vlan_interface' from source: play vars 51385 1727204606.26088: variable '__network_packages_default_wireless' from source: role '' defaults 51385 1727204606.26175: variable '__network_wireless_connections_defined' from source: role '' defaults 51385 1727204606.26509: variable 'network_connections' from source: task vars 51385 1727204606.26521: variable 'interface' from source: play vars 51385 1727204606.26590: variable 'interface' from source: play vars 51385 1727204606.26602: variable 'vlan_interface' from source: play vars 51385 1727204606.26676: variable 'vlan_interface' from source: play vars 51385 1727204606.26702: variable '__network_packages_default_team' from source: role '' defaults 51385 1727204606.26792: variable '__network_team_connections_defined' from source: role '' defaults 51385 1727204606.27126: variable 'network_connections' from source: task vars 51385 1727204606.27135: variable 'interface' from source: play vars 51385 1727204606.27209: variable 'interface' from source: play vars 51385 1727204606.27221: variable 'vlan_interface' from source: play vars 51385 1727204606.27295: variable 'vlan_interface' from source: play vars 51385 1727204606.27350: variable '__network_service_name_default_initscripts' from 
source: role '' defaults 51385 1727204606.27422: variable '__network_service_name_default_initscripts' from source: role '' defaults 51385 1727204606.27433: variable '__network_packages_default_initscripts' from source: role '' defaults 51385 1727204606.27505: variable '__network_packages_default_initscripts' from source: role '' defaults 51385 1727204606.27739: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 51385 1727204606.28261: variable 'network_connections' from source: task vars 51385 1727204606.28273: variable 'interface' from source: play vars 51385 1727204606.28334: variable 'interface' from source: play vars 51385 1727204606.28347: variable 'vlan_interface' from source: play vars 51385 1727204606.28414: variable 'vlan_interface' from source: play vars 51385 1727204606.28426: variable 'ansible_distribution' from source: facts 51385 1727204606.28434: variable '__network_rh_distros' from source: role '' defaults 51385 1727204606.28444: variable 'ansible_distribution_major_version' from source: facts 51385 1727204606.28467: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 51385 1727204606.28638: variable 'ansible_distribution' from source: facts 51385 1727204606.28648: variable '__network_rh_distros' from source: role '' defaults 51385 1727204606.28657: variable 'ansible_distribution_major_version' from source: facts 51385 1727204606.28677: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 51385 1727204606.28847: variable 'ansible_distribution' from source: facts 51385 1727204606.28857: variable '__network_rh_distros' from source: role '' defaults 51385 1727204606.28874: variable 'ansible_distribution_major_version' from source: facts 51385 1727204606.28918: variable 'network_provider' from source: set_fact 51385 1727204606.28938: variable 'ansible_facts' from source: unknown 51385 1727204606.29778: Evaluated conditional (not 
network_packages is subset(ansible_facts.packages.keys())): False 51385 1727204606.29785: when evaluation is False, skipping this task 51385 1727204606.29791: _execute() done 51385 1727204606.29795: dumping result to json 51385 1727204606.29801: done dumping result, returning 51385 1727204606.29810: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install packages [0affcd87-79f5-6b1f-5706-00000000006d] 51385 1727204606.29818: sending task result for task 0affcd87-79f5-6b1f-5706-00000000006d skipping: [managed-node1] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 51385 1727204606.29956: no more pending results, returning what we have 51385 1727204606.29963: results queue empty 51385 1727204606.29966: checking for any_errors_fatal 51385 1727204606.29974: done checking for any_errors_fatal 51385 1727204606.29974: checking for max_fail_percentage 51385 1727204606.29976: done checking for max_fail_percentage 51385 1727204606.29977: checking to see if all hosts have failed and the running result is not ok 51385 1727204606.29978: done checking to see if all hosts have failed 51385 1727204606.29979: getting the remaining hosts for this loop 51385 1727204606.29981: done getting the remaining hosts for this loop 51385 1727204606.29985: getting the next task for host managed-node1 51385 1727204606.29992: done getting next task for host managed-node1 51385 1727204606.29995: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 51385 1727204606.29998: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 51385 1727204606.30018: getting variables 51385 1727204606.30020: in VariableManager get_vars() 51385 1727204606.30078: Calling all_inventory to load vars for managed-node1 51385 1727204606.30081: Calling groups_inventory to load vars for managed-node1 51385 1727204606.30084: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204606.30095: Calling all_plugins_play to load vars for managed-node1 51385 1727204606.30099: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204606.30102: Calling groups_plugins_play to load vars for managed-node1 51385 1727204606.31873: done sending task result for task 0affcd87-79f5-6b1f-5706-00000000006d 51385 1727204606.31877: WORKER PROCESS EXITING 51385 1727204606.33147: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204606.36757: done with get_vars() 51385 1727204606.37805: done getting variables 51385 1727204606.37870: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 15:03:26 -0400 (0:00:00.230) 0:00:24.782 ***** 51385 1727204606.37906: 
entering _queue_task() for managed-node1/package 51385 1727204606.38237: worker is 1 (out of 1 available) 51385 1727204606.38251: exiting _queue_task() for managed-node1/package 51385 1727204606.38269: done queuing things up, now waiting for results queue to drain 51385 1727204606.38270: waiting for pending results... 51385 1727204606.39114: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 51385 1727204606.39457: in run() - task 0affcd87-79f5-6b1f-5706-00000000006e 51385 1727204606.39481: variable 'ansible_search_path' from source: unknown 51385 1727204606.39541: variable 'ansible_search_path' from source: unknown 51385 1727204606.39588: calling self._execute() 51385 1727204606.39741: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204606.39823: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204606.39838: variable 'omit' from source: magic vars 51385 1727204606.40247: variable 'ansible_distribution_major_version' from source: facts 51385 1727204606.40269: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204606.40407: variable 'network_state' from source: role '' defaults 51385 1727204606.40426: Evaluated conditional (network_state != {}): False 51385 1727204606.40434: when evaluation is False, skipping this task 51385 1727204606.40443: _execute() done 51385 1727204606.40450: dumping result to json 51385 1727204606.40457: done dumping result, returning 51385 1727204606.40470: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcd87-79f5-6b1f-5706-00000000006e] 51385 1727204606.40483: sending task result for task 0affcd87-79f5-6b1f-5706-00000000006e 51385 1727204606.40603: done sending task result for task 0affcd87-79f5-6b1f-5706-00000000006e 51385 
1727204606.40611: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 51385 1727204606.40669: no more pending results, returning what we have 51385 1727204606.40673: results queue empty 51385 1727204606.40674: checking for any_errors_fatal 51385 1727204606.40682: done checking for any_errors_fatal 51385 1727204606.40683: checking for max_fail_percentage 51385 1727204606.40685: done checking for max_fail_percentage 51385 1727204606.40686: checking to see if all hosts have failed and the running result is not ok 51385 1727204606.40686: done checking to see if all hosts have failed 51385 1727204606.40687: getting the remaining hosts for this loop 51385 1727204606.40689: done getting the remaining hosts for this loop 51385 1727204606.40692: getting the next task for host managed-node1 51385 1727204606.40699: done getting next task for host managed-node1 51385 1727204606.40703: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 51385 1727204606.40706: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51385 1727204606.40725: getting variables 51385 1727204606.40727: in VariableManager get_vars() 51385 1727204606.40772: Calling all_inventory to load vars for managed-node1 51385 1727204606.40775: Calling groups_inventory to load vars for managed-node1 51385 1727204606.40777: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204606.40789: Calling all_plugins_play to load vars for managed-node1 51385 1727204606.40791: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204606.40794: Calling groups_plugins_play to load vars for managed-node1 51385 1727204606.43224: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204606.47288: done with get_vars() 51385 1727204606.47329: done getting variables 51385 1727204606.47402: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 15:03:26 -0400 (0:00:00.095) 0:00:24.878 ***** 51385 1727204606.47441: entering _queue_task() for managed-node1/package 51385 1727204606.48058: worker is 1 (out of 1 available) 51385 1727204606.48075: exiting _queue_task() for managed-node1/package 51385 1727204606.48088: done queuing things up, now waiting for results queue to drain 51385 1727204606.48089: waiting for pending results... 
51385 1727204606.48392: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 51385 1727204606.48544: in run() - task 0affcd87-79f5-6b1f-5706-00000000006f 51385 1727204606.48570: variable 'ansible_search_path' from source: unknown 51385 1727204606.48579: variable 'ansible_search_path' from source: unknown 51385 1727204606.48626: calling self._execute() 51385 1727204606.48735: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204606.48746: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204606.48762: variable 'omit' from source: magic vars 51385 1727204606.49369: variable 'ansible_distribution_major_version' from source: facts 51385 1727204606.49388: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204606.49629: variable 'network_state' from source: role '' defaults 51385 1727204606.49645: Evaluated conditional (network_state != {}): False 51385 1727204606.49652: when evaluation is False, skipping this task 51385 1727204606.49666: _execute() done 51385 1727204606.49686: dumping result to json 51385 1727204606.49795: done dumping result, returning 51385 1727204606.49809: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcd87-79f5-6b1f-5706-00000000006f] 51385 1727204606.49821: sending task result for task 0affcd87-79f5-6b1f-5706-00000000006f skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 51385 1727204606.49984: no more pending results, returning what we have 51385 1727204606.49988: results queue empty 51385 1727204606.49989: checking for any_errors_fatal 51385 1727204606.49994: done checking for any_errors_fatal 51385 1727204606.49995: checking for max_fail_percentage 51385 
1727204606.49997: done checking for max_fail_percentage 51385 1727204606.49998: checking to see if all hosts have failed and the running result is not ok 51385 1727204606.49999: done checking to see if all hosts have failed 51385 1727204606.50000: getting the remaining hosts for this loop 51385 1727204606.50001: done getting the remaining hosts for this loop 51385 1727204606.50005: getting the next task for host managed-node1 51385 1727204606.50012: done getting next task for host managed-node1 51385 1727204606.50016: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 51385 1727204606.50020: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51385 1727204606.50040: getting variables 51385 1727204606.50042: in VariableManager get_vars() 51385 1727204606.50090: Calling all_inventory to load vars for managed-node1 51385 1727204606.50094: Calling groups_inventory to load vars for managed-node1 51385 1727204606.50096: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204606.50110: Calling all_plugins_play to load vars for managed-node1 51385 1727204606.50114: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204606.50117: Calling groups_plugins_play to load vars for managed-node1 51385 1727204606.51423: done sending task result for task 0affcd87-79f5-6b1f-5706-00000000006f 51385 1727204606.51427: WORKER PROCESS EXITING 51385 1727204606.53031: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204606.55752: done with get_vars() 51385 1727204606.55790: done getting variables 51385 1727204606.55854: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 15:03:26 -0400 (0:00:00.084) 0:00:24.962 ***** 51385 1727204606.55896: entering _queue_task() for managed-node1/service 51385 1727204606.56224: worker is 1 (out of 1 available) 51385 1727204606.56236: exiting _queue_task() for managed-node1/service 51385 1727204606.56251: done queuing things up, now waiting for results queue to drain 51385 1727204606.56252: waiting for pending results... 
51385 1727204606.56555: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 51385 1727204606.56732: in run() - task 0affcd87-79f5-6b1f-5706-000000000070 51385 1727204606.56754: variable 'ansible_search_path' from source: unknown 51385 1727204606.56768: variable 'ansible_search_path' from source: unknown 51385 1727204606.56926: calling self._execute() 51385 1727204606.57146: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204606.57158: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204606.57179: variable 'omit' from source: magic vars 51385 1727204606.57973: variable 'ansible_distribution_major_version' from source: facts 51385 1727204606.57991: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204606.58262: variable '__network_wireless_connections_defined' from source: role '' defaults 51385 1727204606.58715: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 51385 1727204606.63916: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 51385 1727204606.63996: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 51385 1727204606.64151: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 51385 1727204606.64197: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 51385 1727204606.64354: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 51385 1727204606.64446: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 51385 1727204606.64497: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 51385 1727204606.64525: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 51385 1727204606.64577: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 51385 1727204606.64595: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 51385 1727204606.64642: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 51385 1727204606.64680: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 51385 1727204606.64707: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 51385 1727204606.64749: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 51385 1727204606.64776: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 51385 1727204606.64818: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 51385 1727204606.64846: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 51385 1727204606.64883: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 51385 1727204606.64925: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 51385 1727204606.64942: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 51385 1727204606.65142: variable 'network_connections' from source: task vars 51385 1727204606.65162: variable 'interface' from source: play vars 51385 1727204606.65243: variable 'interface' from source: play vars 51385 1727204606.65257: variable 'vlan_interface' from source: play vars 51385 1727204606.65333: variable 'vlan_interface' from source: play vars 51385 1727204606.65408: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 51385 1727204606.65585: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 51385 1727204606.65627: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 51385 1727204606.65670: 
Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 51385 1727204606.65702: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 51385 1727204606.65752: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 51385 1727204606.65783: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 51385 1727204606.65812: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 51385 1727204606.65843: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 51385 1727204606.65905: variable '__network_team_connections_defined' from source: role '' defaults 51385 1727204606.66171: variable 'network_connections' from source: task vars 51385 1727204606.66186: variable 'interface' from source: play vars 51385 1727204606.66249: variable 'interface' from source: play vars 51385 1727204606.66267: variable 'vlan_interface' from source: play vars 51385 1727204606.66331: variable 'vlan_interface' from source: play vars 51385 1727204606.66362: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 51385 1727204606.66373: when evaluation is False, skipping this task 51385 1727204606.66381: _execute() done 51385 1727204606.66388: dumping result to json 51385 1727204606.66400: done dumping result, returning 51385 1727204606.66413: done running TaskExecutor() for managed-node1/TASK: 
fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcd87-79f5-6b1f-5706-000000000070] 51385 1727204606.66432: sending task result for task 0affcd87-79f5-6b1f-5706-000000000070 skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 51385 1727204606.66603: no more pending results, returning what we have 51385 1727204606.66607: results queue empty 51385 1727204606.66608: checking for any_errors_fatal 51385 1727204606.66617: done checking for any_errors_fatal 51385 1727204606.66617: checking for max_fail_percentage 51385 1727204606.66619: done checking for max_fail_percentage 51385 1727204606.66620: checking to see if all hosts have failed and the running result is not ok 51385 1727204606.66621: done checking to see if all hosts have failed 51385 1727204606.66622: getting the remaining hosts for this loop 51385 1727204606.66623: done getting the remaining hosts for this loop 51385 1727204606.66627: getting the next task for host managed-node1 51385 1727204606.66634: done getting next task for host managed-node1 51385 1727204606.66639: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 51385 1727204606.66642: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51385 1727204606.66674: getting variables 51385 1727204606.66677: in VariableManager get_vars() 51385 1727204606.66723: Calling all_inventory to load vars for managed-node1 51385 1727204606.66726: Calling groups_inventory to load vars for managed-node1 51385 1727204606.66729: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204606.66740: Calling all_plugins_play to load vars for managed-node1 51385 1727204606.66742: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204606.66745: Calling groups_plugins_play to load vars for managed-node1 51385 1727204606.67896: done sending task result for task 0affcd87-79f5-6b1f-5706-000000000070 51385 1727204606.67900: WORKER PROCESS EXITING 51385 1727204606.69134: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204606.72177: done with get_vars() 51385 1727204606.72211: done getting variables 51385 1727204606.72279: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 15:03:26 -0400 (0:00:00.164) 0:00:25.127 ***** 51385 1727204606.72314: entering _queue_task() for managed-node1/service 51385 1727204606.72651: worker is 1 (out of 1 available) 51385 1727204606.72668: exiting _queue_task() for managed-node1/service 51385 1727204606.72681: done queuing things up, now waiting for results queue to drain 51385 1727204606.72682: waiting for pending results... 
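The skip recorded above ("Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False ... when evaluation is False, skipping this task") follows the standard Ansible rule that a task runs only if every `when` condition templates to a truthy value. A minimal sketch of that decision logic (simplified: real Ansible renders each condition through Jinja2 against the full variable stack; here the conditions are already-evaluated booleans):

```python
# Minimal sketch of the 'when' decision: a task runs only if ALL of its
# conditions are truthy. This is a simplification, not Ansible's actual
# TaskExecutor code.

def should_run(conditions):
    """Return True only if every condition evaluates truthy."""
    return all(bool(c) for c in conditions)

# Mirrors the log: neither wireless nor team connections are defined,
# so the OR of the two flags is False and the task is skipped.
wireless_defined = False
team_defined = False
print(should_run([wireless_defined or team_defined]))  # False -> skipped
```

When the result is False, the executor short-circuits before the action plugin runs and reports `skip_reason: "Conditional result was False"`, exactly as in the JSON result above.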
51385 1727204606.72985: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 51385 1727204606.73134: in run() - task 0affcd87-79f5-6b1f-5706-000000000071 51385 1727204606.73157: variable 'ansible_search_path' from source: unknown 51385 1727204606.73172: variable 'ansible_search_path' from source: unknown 51385 1727204606.73212: calling self._execute() 51385 1727204606.73318: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204606.73330: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204606.73349: variable 'omit' from source: magic vars 51385 1727204606.73779: variable 'ansible_distribution_major_version' from source: facts 51385 1727204606.73800: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204606.73972: variable 'network_provider' from source: set_fact 51385 1727204606.73999: variable 'network_state' from source: role '' defaults 51385 1727204606.74021: Evaluated conditional (network_provider == "nm" or network_state != {}): True 51385 1727204606.74031: variable 'omit' from source: magic vars 51385 1727204606.74092: variable 'omit' from source: magic vars 51385 1727204606.74129: variable 'network_service_name' from source: role '' defaults 51385 1727204606.74202: variable 'network_service_name' from source: role '' defaults 51385 1727204606.74319: variable '__network_provider_setup' from source: role '' defaults 51385 1727204606.74335: variable '__network_service_name_default_nm' from source: role '' defaults 51385 1727204606.74417: variable '__network_service_name_default_nm' from source: role '' defaults 51385 1727204606.74430: variable '__network_packages_default_nm' from source: role '' defaults 51385 1727204606.74509: variable '__network_packages_default_nm' from source: role '' defaults 51385 1727204606.74748: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 51385 1727204606.77486: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 51385 1727204606.77565: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 51385 1727204606.77608: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 51385 1727204606.77651: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 51385 1727204606.77688: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 51385 1727204606.77776: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 51385 1727204606.77824: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 51385 1727204606.77865: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 51385 1727204606.77911: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 51385 1727204606.77929: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 51385 1727204606.77986: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 51385 1727204606.78013: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 51385 1727204606.78042: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 51385 1727204606.78093: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 51385 1727204606.78112: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 51385 1727204606.78355: variable '__network_packages_default_gobject_packages' from source: role '' defaults 51385 1727204606.78493: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 51385 1727204606.78523: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 51385 1727204606.78552: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 51385 1727204606.78603: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 51385 1727204606.78624: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 51385 1727204606.78729: variable 'ansible_python' from source: facts 51385 1727204606.78756: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 51385 1727204606.78853: variable '__network_wpa_supplicant_required' from source: role '' defaults 51385 1727204606.78944: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 51385 1727204606.79086: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 51385 1727204606.79117: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 51385 1727204606.79149: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 51385 1727204606.79203: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 51385 1727204606.79223: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 51385 1727204606.79284: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 51385 1727204606.79319: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 51385 1727204606.79345: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 51385 1727204606.79396: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 51385 1727204606.79414: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 51385 1727204606.79570: variable 'network_connections' from source: task vars 51385 1727204606.79583: variable 'interface' from source: play vars 51385 1727204606.79674: variable 'interface' from source: play vars 51385 1727204606.79691: variable 'vlan_interface' from source: play vars 51385 1727204606.79774: variable 'vlan_interface' from source: play vars 51385 1727204606.79890: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 51385 1727204606.80088: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 51385 1727204606.80146: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 51385 1727204606.80197: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 51385 1727204606.80241: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 51385 1727204606.80316: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 51385 1727204606.80351: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 51385 1727204606.80396: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 51385 1727204606.80433: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 51385 1727204606.80498: variable '__network_wireless_connections_defined' from source: role '' defaults 51385 1727204606.80820: variable 'network_connections' from source: task vars 51385 1727204606.80831: variable 'interface' from source: play vars 51385 1727204606.80921: variable 'interface' from source: play vars 51385 1727204606.80937: variable 'vlan_interface' from source: play vars 51385 1727204606.81019: variable 'vlan_interface' from source: play vars 51385 1727204606.81054: variable '__network_packages_default_wireless' from source: role '' defaults 51385 1727204606.81144: variable '__network_wireless_connections_defined' from source: role '' defaults 51385 1727204606.81471: variable 'network_connections' from source: task vars 51385 1727204606.81482: variable 'interface' from source: play vars 51385 1727204606.81557: variable 'interface' from source: play vars 51385 1727204606.81575: variable 'vlan_interface' from source: play vars 51385 1727204606.81649: variable 'vlan_interface' from source: play vars 51385 1727204606.81684: variable '__network_packages_default_team' from source: role '' defaults 51385 1727204606.81769: variable '__network_team_connections_defined' from source: role '' defaults 51385 
1727204606.82086: variable 'network_connections' from source: task vars 51385 1727204606.82096: variable 'interface' from source: play vars 51385 1727204606.82176: variable 'interface' from source: play vars 51385 1727204606.82190: variable 'vlan_interface' from source: play vars 51385 1727204606.82268: variable 'vlan_interface' from source: play vars 51385 1727204606.82327: variable '__network_service_name_default_initscripts' from source: role '' defaults 51385 1727204606.82392: variable '__network_service_name_default_initscripts' from source: role '' defaults 51385 1727204606.82404: variable '__network_packages_default_initscripts' from source: role '' defaults 51385 1727204606.82474: variable '__network_packages_default_initscripts' from source: role '' defaults 51385 1727204606.82706: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 51385 1727204606.83233: variable 'network_connections' from source: task vars 51385 1727204606.83244: variable 'interface' from source: play vars 51385 1727204606.83303: variable 'interface' from source: play vars 51385 1727204606.83313: variable 'vlan_interface' from source: play vars 51385 1727204606.83368: variable 'vlan_interface' from source: play vars 51385 1727204606.83380: variable 'ansible_distribution' from source: facts 51385 1727204606.83386: variable '__network_rh_distros' from source: role '' defaults 51385 1727204606.83394: variable 'ansible_distribution_major_version' from source: facts 51385 1727204606.83414: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 51385 1727204606.83596: variable 'ansible_distribution' from source: facts 51385 1727204606.83605: variable '__network_rh_distros' from source: role '' defaults 51385 1727204606.83613: variable 'ansible_distribution_major_version' from source: facts 51385 1727204606.83633: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 51385 
1727204606.83820: variable 'ansible_distribution' from source: facts 51385 1727204606.83829: variable '__network_rh_distros' from source: role '' defaults 51385 1727204606.83842: variable 'ansible_distribution_major_version' from source: facts 51385 1727204606.83886: variable 'network_provider' from source: set_fact 51385 1727204606.83912: variable 'omit' from source: magic vars 51385 1727204606.83943: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 51385 1727204606.83981: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 51385 1727204606.84005: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 51385 1727204606.84082: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 51385 1727204606.84099: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 51385 1727204606.84131: variable 'inventory_hostname' from source: host vars for 'managed-node1' 51385 1727204606.84175: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204606.84287: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204606.84502: Set connection var ansible_pipelining to False 51385 1727204606.84510: Set connection var ansible_shell_type to sh 51385 1727204606.84526: Set connection var ansible_module_compression to ZIP_DEFLATED 51385 1727204606.84538: Set connection var ansible_timeout to 10 51385 1727204606.84545: Set connection var ansible_connection to ssh 51385 1727204606.84553: Set connection var ansible_shell_executable to /bin/sh 51385 1727204606.84588: variable 'ansible_shell_executable' from source: unknown 51385 1727204606.84596: variable 'ansible_connection' from source: unknown 51385 1727204606.84608: variable 
'ansible_module_compression' from source: unknown 51385 1727204606.84615: variable 'ansible_shell_type' from source: unknown 51385 1727204606.84622: variable 'ansible_shell_executable' from source: unknown 51385 1727204606.84634: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204606.84642: variable 'ansible_pipelining' from source: unknown 51385 1727204606.84649: variable 'ansible_timeout' from source: unknown 51385 1727204606.84660: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204606.84904: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 51385 1727204606.84921: variable 'omit' from source: magic vars 51385 1727204606.84998: starting attempt loop 51385 1727204606.85005: running the handler 51385 1727204606.85132: variable 'ansible_facts' from source: unknown 51385 1727204606.87103: _low_level_execute_command(): starting 51385 1727204606.87292: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 51385 1727204606.88434: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204606.88470: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204606.88498: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204606.88518: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204606.88569: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204606.88587: stderr chunk (state=3): >>>debug2: match not found <<< 51385 1727204606.88610: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204606.88628: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 51385 1727204606.88639: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 51385 1727204606.88649: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 51385 1727204606.88662: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204606.88679: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204606.88696: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204606.88707: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204606.88724: stderr chunk (state=3): >>>debug2: match found <<< 51385 1727204606.88740: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204606.88829: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204606.88855: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204606.88874: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204606.88968: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204606.90607: stdout chunk (state=3): >>>/root <<< 51385 1727204606.90781: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204606.90821: stderr chunk (state=3): >>><<< 51385 1727204606.90824: stdout chunk (state=3): >>><<< 51385 1727204606.90845: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204606.90857: _low_level_execute_command(): starting 51385 1727204606.90867: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204606.9084592-53096-32647654480237 `" && echo ansible-tmp-1727204606.9084592-53096-32647654480237="` echo /root/.ansible/tmp/ansible-tmp-1727204606.9084592-53096-32647654480237 `" ) && sleep 0' 51385 1727204606.92216: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204606.92221: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204606.92225: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204606.92258: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204606.92302: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204606.92309: stderr chunk 
(state=3): >>>debug2: match not found <<< 51385 1727204606.92319: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204606.92332: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 51385 1727204606.92342: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 51385 1727204606.92345: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 51385 1727204606.92353: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204606.92373: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204606.92378: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204606.92386: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204606.92393: stderr chunk (state=3): >>>debug2: match found <<< 51385 1727204606.92403: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204606.92486: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204606.92493: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204606.92499: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204606.92588: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204606.94449: stdout chunk (state=3): >>>ansible-tmp-1727204606.9084592-53096-32647654480237=/root/.ansible/tmp/ansible-tmp-1727204606.9084592-53096-32647654480237 <<< 51385 1727204606.94572: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204606.94651: stderr chunk (state=3): >>><<< 51385 1727204606.94674: stdout chunk (state=3): >>><<< 51385 1727204606.94973: 
_low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204606.9084592-53096-32647654480237=/root/.ansible/tmp/ansible-tmp-1727204606.9084592-53096-32647654480237 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204606.94981: variable 'ansible_module_compression' from source: unknown 51385 1727204606.94983: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-51385tpxlmlox/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 51385 1727204606.94985: variable 'ansible_facts' from source: unknown 51385 1727204606.95077: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204606.9084592-53096-32647654480237/AnsiballZ_systemd.py 51385 1727204606.95702: Sending initial data 51385 1727204606.95705: Sent initial data (155 bytes) 51385 1727204606.96944: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204606.96948: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204606.96988: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204606.96991: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204606.96993: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found <<< 51385 1727204606.96995: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204606.97061: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204606.97086: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204606.97178: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204606.98886: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" 
revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 51385 1727204606.98976: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 51385 1727204606.98994: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-51385tpxlmlox/tmpbvooqngu /root/.ansible/tmp/ansible-tmp-1727204606.9084592-53096-32647654480237/AnsiballZ_systemd.py <<< 51385 1727204606.99090: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 51385 1727204607.02180: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204607.02256: stderr chunk (state=3): >>><<< 51385 1727204607.02263: stdout chunk (state=3): >>><<< 51385 1727204607.02282: done transferring module to remote 51385 1727204607.02297: _low_level_execute_command(): starting 51385 1727204607.02300: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204606.9084592-53096-32647654480237/ /root/.ansible/tmp/ansible-tmp-1727204606.9084592-53096-32647654480237/AnsiballZ_systemd.py && sleep 0' 51385 1727204607.03589: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204607.03596: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204607.03607: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204607.03620: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204607.03657: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204607.03666: stderr chunk (state=3): >>>debug2: match not found <<< 51385 1727204607.03677: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 
1727204607.03691: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 51385 1727204607.03699: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 51385 1727204607.03707: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 51385 1727204607.03712: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204607.03721: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204607.03732: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204607.03739: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204607.03745: stderr chunk (state=3): >>>debug2: match found <<< 51385 1727204607.03754: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204607.03826: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204607.03840: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204607.03849: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204607.03938: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204607.05682: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204607.05685: stderr chunk (state=3): >>><<< 51385 1727204607.05688: stdout chunk (state=3): >>><<< 51385 1727204607.05706: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204607.05709: _low_level_execute_command(): starting 51385 1727204607.05713: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204606.9084592-53096-32647654480237/AnsiballZ_systemd.py && sleep 0' 51385 1727204607.06301: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204607.06311: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204607.06322: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204607.06335: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204607.06373: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204607.06379: stderr chunk (state=3): >>>debug2: match not found <<< 51385 1727204607.06389: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204607.06402: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 51385 1727204607.06409: 
stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 51385 1727204607.06415: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 51385 1727204607.06422: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204607.06431: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204607.06443: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204607.06450: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204607.06461: stderr chunk (state=3): >>>debug2: match found <<< 51385 1727204607.06466: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204607.06538: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204607.06550: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204607.06567: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204607.06656: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204607.31935: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "70053", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", 
"NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 15:02:41 EDT", "ExecMainStartTimestampMonotonic": "824430121", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "70053", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[Tue 2024-09-24 15:02:41 EDT] ; stop_time=[n/a] ; pid=70053 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[Tue 2024-09-24 15:02:41 EDT] ; stop_time=[n/a] ; pid=70053 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "<<< 51385 1727204607.31975: stdout chunk (state=3): >>>system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "5101", "MemoryCurrent": "6262784", "MemoryAvailable": "infinity", "CPUUsageNSec": "194903000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", 
"CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": 
"null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service NetworkManager-wait-online.service 
network.target network.service shutdown.target multi-user.target", "After": "dbus.socket systemd-journald.socket sysinit.target network-pre.target basic.target system.slice cloud-init-local.service dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 15:02:41 EDT", "StateChangeTimestampMonotonic": "824517223", "InactiveExitTimestamp": "Tue 2024-09-24 15:02:41 EDT", "InactiveExitTimestampMonotonic": "824430408", "ActiveEnterTimestamp": "Tue 2024-09-24 15:02:41 EDT", "ActiveEnterTimestampMonotonic": "824517223", "ActiveExitTimestamp": "Tue 2024-09-24 15:02:41 EDT", "ActiveExitTimestampMonotonic": "824386950", "InactiveEnterTimestamp": "Tue 2024-09-24 15:02:41 EDT", "InactiveEnterTimestampMonotonic": "824423584", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 15:02:41 EDT", "ConditionTimestampMonotonic": "824424123", "AssertTimestamp": "Tue 2024-09-24 15:02:41 EDT", "AssertTimestampMonotonic": "824424125", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "1aab9e9897314f7fb6bad2151914424e", 
"CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 51385 1727204607.33594: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. <<< 51385 1727204607.33598: stdout chunk (state=3): >>><<< 51385 1727204607.33601: stderr chunk (state=3): >>><<< 51385 1727204607.33772: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "70053", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 15:02:41 EDT", "ExecMainStartTimestampMonotonic": "824430121", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "70053", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[Tue 2024-09-24 15:02:41 EDT] ; stop_time=[n/a] ; pid=70053 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon 
; flags= ; start_time=[Tue 2024-09-24 15:02:41 EDT] ; stop_time=[n/a] ; pid=70053 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "5101", "MemoryCurrent": "6262784", "MemoryAvailable": "infinity", "CPUUsageNSec": "194903000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", 
"UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", 
"ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service NetworkManager-wait-online.service network.target network.service shutdown.target multi-user.target", "After": "dbus.socket systemd-journald.socket sysinit.target network-pre.target basic.target system.slice cloud-init-local.service dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 15:02:41 EDT", "StateChangeTimestampMonotonic": "824517223", "InactiveExitTimestamp": "Tue 2024-09-24 15:02:41 EDT", "InactiveExitTimestampMonotonic": "824430408", 
"ActiveEnterTimestamp": "Tue 2024-09-24 15:02:41 EDT", "ActiveEnterTimestampMonotonic": "824517223", "ActiveExitTimestamp": "Tue 2024-09-24 15:02:41 EDT", "ActiveExitTimestampMonotonic": "824386950", "InactiveEnterTimestamp": "Tue 2024-09-24 15:02:41 EDT", "InactiveEnterTimestampMonotonic": "824423584", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 15:02:41 EDT", "ConditionTimestampMonotonic": "824424123", "AssertTimestamp": "Tue 2024-09-24 15:02:41 EDT", "AssertTimestampMonotonic": "824424125", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "1aab9e9897314f7fb6bad2151914424e", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 51385 1727204607.33911: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204606.9084592-53096-32647654480237/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 51385 1727204607.33915: _low_level_execute_command(): starting 51385 1727204607.33917: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204606.9084592-53096-32647654480237/ > /dev/null 2>&1 && sleep 0' 51385 1727204607.34511: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204607.34525: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204607.34538: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204607.34554: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204607.34601: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204607.34614: stderr chunk (state=3): >>>debug2: match not found <<< 51385 1727204607.34627: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204607.34642: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 51385 1727204607.34653: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 51385 1727204607.34670: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 51385 1727204607.34681: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204607.34693: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204607.34708: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204607.34720: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204607.34730: stderr chunk (state=3): >>>debug2: match found <<< 51385 1727204607.34743: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204607.34824: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204607.34847: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204607.34872: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204607.34961: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204607.36785: stderr chunk (state=3): >>>debug2: 
Received exit status from master 0 <<< 51385 1727204607.36883: stderr chunk (state=3): >>><<< 51385 1727204607.36887: stdout chunk (state=3): >>><<< 51385 1727204607.36971: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204607.36974: handler run complete 51385 1727204607.37270: attempt loop complete, returning result 51385 1727204607.37273: _execute() done 51385 1727204607.37276: dumping result to json 51385 1727204607.37278: done dumping result, returning 51385 1727204607.37280: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcd87-79f5-6b1f-5706-000000000071] 51385 1727204607.37282: sending task result for task 0affcd87-79f5-6b1f-5706-000000000071 51385 1727204607.37403: done sending task result for task 0affcd87-79f5-6b1f-5706-000000000071 51385 
1727204607.37407: WORKER PROCESS EXITING ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 51385 1727204607.37468: no more pending results, returning what we have 51385 1727204607.37472: results queue empty 51385 1727204607.37473: checking for any_errors_fatal 51385 1727204607.37479: done checking for any_errors_fatal 51385 1727204607.37480: checking for max_fail_percentage 51385 1727204607.37482: done checking for max_fail_percentage 51385 1727204607.37483: checking to see if all hosts have failed and the running result is not ok 51385 1727204607.37484: done checking to see if all hosts have failed 51385 1727204607.37485: getting the remaining hosts for this loop 51385 1727204607.37487: done getting the remaining hosts for this loop 51385 1727204607.37491: getting the next task for host managed-node1 51385 1727204607.37498: done getting next task for host managed-node1 51385 1727204607.37502: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 51385 1727204607.37505: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51385 1727204607.37516: getting variables 51385 1727204607.37518: in VariableManager get_vars() 51385 1727204607.37557: Calling all_inventory to load vars for managed-node1 51385 1727204607.37562: Calling groups_inventory to load vars for managed-node1 51385 1727204607.37565: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204607.37575: Calling all_plugins_play to load vars for managed-node1 51385 1727204607.37578: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204607.37580: Calling groups_plugins_play to load vars for managed-node1 51385 1727204607.39382: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204607.41122: done with get_vars() 51385 1727204607.41147: done getting variables 51385 1727204607.41207: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 15:03:27 -0400 (0:00:00.689) 0:00:25.816 ***** 51385 1727204607.41243: entering _queue_task() for managed-node1/service 51385 1727204607.41572: worker is 1 (out of 1 available) 51385 1727204607.41586: exiting _queue_task() for managed-node1/service 51385 1727204607.41599: done queuing things up, now waiting for results queue to drain 51385 1727204607.41600: waiting for pending results... 
51385 1727204607.41923: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 51385 1727204607.42088: in run() - task 0affcd87-79f5-6b1f-5706-000000000072 51385 1727204607.42111: variable 'ansible_search_path' from source: unknown 51385 1727204607.42119: variable 'ansible_search_path' from source: unknown 51385 1727204607.42169: calling self._execute() 51385 1727204607.42279: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204607.42291: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204607.42306: variable 'omit' from source: magic vars 51385 1727204607.42703: variable 'ansible_distribution_major_version' from source: facts 51385 1727204607.42721: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204607.42845: variable 'network_provider' from source: set_fact 51385 1727204607.42856: Evaluated conditional (network_provider == "nm"): True 51385 1727204607.42965: variable '__network_wpa_supplicant_required' from source: role '' defaults 51385 1727204607.43065: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 51385 1727204607.43252: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 51385 1727204607.45591: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 51385 1727204607.45675: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 51385 1727204607.45715: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 51385 1727204607.45762: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 51385 1727204607.45796: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 51385 1727204607.45900: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 51385 1727204607.45933: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 51385 1727204607.45974: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 51385 1727204607.46020: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 51385 1727204607.46038: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 51385 1727204607.46096: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 51385 1727204607.46124: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 51385 1727204607.46153: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 51385 1727204607.46203: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 51385 1727204607.46221: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 51385 1727204607.46267: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 51385 1727204607.46299: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 51385 1727204607.46326: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 51385 1727204607.46372: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 51385 1727204607.46394: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 51385 1727204607.46553: variable 'network_connections' from source: task vars 51385 1727204607.46576: variable 'interface' from source: play vars 51385 1727204607.46655: variable 'interface' from source: play vars 51385 1727204607.46675: variable 'vlan_interface' from source: play vars 51385 1727204607.46746: variable 'vlan_interface' from source: play vars 51385 1727204607.46827: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 51385 1727204607.47004: Loading TestModule 'core' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 51385 1727204607.47049: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 51385 1727204607.47088: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 51385 1727204607.47120: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 51385 1727204607.47173: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 51385 1727204607.47200: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 51385 1727204607.47228: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 51385 1727204607.47272: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 51385 1727204607.47330: variable '__network_wireless_connections_defined' from source: role '' defaults 51385 1727204607.47790: variable 'network_connections' from source: task vars 51385 1727204607.47802: variable 'interface' from source: play vars 51385 1727204607.47873: variable 'interface' from source: play vars 51385 1727204607.47885: variable 'vlan_interface' from source: play vars 51385 1727204607.47951: variable 'vlan_interface' from source: play vars 51385 1727204607.47990: Evaluated conditional (__network_wpa_supplicant_required): False 51385 1727204607.47999: when evaluation is False, skipping this task 51385 1727204607.48015: _execute() done 51385 
1727204607.48027: dumping result to json 51385 1727204607.48035: done dumping result, returning 51385 1727204607.48047: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcd87-79f5-6b1f-5706-000000000072] 51385 1727204607.48058: sending task result for task 0affcd87-79f5-6b1f-5706-000000000072 skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 51385 1727204607.48218: no more pending results, returning what we have 51385 1727204607.48222: results queue empty 51385 1727204607.48224: checking for any_errors_fatal 51385 1727204607.48244: done checking for any_errors_fatal 51385 1727204607.48246: checking for max_fail_percentage 51385 1727204607.48248: done checking for max_fail_percentage 51385 1727204607.48249: checking to see if all hosts have failed and the running result is not ok 51385 1727204607.48250: done checking to see if all hosts have failed 51385 1727204607.48251: getting the remaining hosts for this loop 51385 1727204607.48253: done getting the remaining hosts for this loop 51385 1727204607.48257: getting the next task for host managed-node1 51385 1727204607.48269: done getting next task for host managed-node1 51385 1727204607.48273: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 51385 1727204607.48276: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 51385 1727204607.48295: getting variables 51385 1727204607.48297: in VariableManager get_vars() 51385 1727204607.48341: Calling all_inventory to load vars for managed-node1 51385 1727204607.48344: Calling groups_inventory to load vars for managed-node1 51385 1727204607.48347: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204607.48358: Calling all_plugins_play to load vars for managed-node1 51385 1727204607.48365: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204607.48369: Calling groups_plugins_play to load vars for managed-node1 51385 1727204607.49385: done sending task result for task 0affcd87-79f5-6b1f-5706-000000000072 51385 1727204607.49389: WORKER PROCESS EXITING 51385 1727204607.50303: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204607.58074: done with get_vars() 51385 1727204607.58127: done getting variables 51385 1727204607.58488: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 15:03:27 -0400 (0:00:00.172) 0:00:25.989 ***** 51385 1727204607.58518: entering _queue_task() for managed-node1/service 51385 1727204607.58870: worker is 1 (out of 1 available) 51385 1727204607.58883: exiting _queue_task() for managed-node1/service 51385 1727204607.58894: done queuing things up, now waiting for results queue to drain 51385 1727204607.58895: waiting for pending results... 
51385 1727204607.59469: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable network service 51385 1727204607.59724: in run() - task 0affcd87-79f5-6b1f-5706-000000000073 51385 1727204607.59728: variable 'ansible_search_path' from source: unknown 51385 1727204607.59851: variable 'ansible_search_path' from source: unknown 51385 1727204607.59896: calling self._execute() 51385 1727204607.60111: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204607.60117: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204607.60130: variable 'omit' from source: magic vars 51385 1727204607.61011: variable 'ansible_distribution_major_version' from source: facts 51385 1727204607.61021: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204607.61109: variable 'network_provider' from source: set_fact 51385 1727204607.61113: Evaluated conditional (network_provider == "initscripts"): False 51385 1727204607.61115: when evaluation is False, skipping this task 51385 1727204607.61120: _execute() done 51385 1727204607.61123: dumping result to json 51385 1727204607.61126: done dumping result, returning 51385 1727204607.61132: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable network service [0affcd87-79f5-6b1f-5706-000000000073] 51385 1727204607.61139: sending task result for task 0affcd87-79f5-6b1f-5706-000000000073 51385 1727204607.61239: done sending task result for task 0affcd87-79f5-6b1f-5706-000000000073 51385 1727204607.61241: WORKER PROCESS EXITING skipping: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 51385 1727204607.61292: no more pending results, returning what we have 51385 1727204607.61296: results queue empty 51385 1727204607.61297: checking for any_errors_fatal 51385 1727204607.61305: done checking for 
any_errors_fatal 51385 1727204607.61306: checking for max_fail_percentage 51385 1727204607.61308: done checking for max_fail_percentage 51385 1727204607.61309: checking to see if all hosts have failed and the running result is not ok 51385 1727204607.61310: done checking to see if all hosts have failed 51385 1727204607.61311: getting the remaining hosts for this loop 51385 1727204607.61312: done getting the remaining hosts for this loop 51385 1727204607.61315: getting the next task for host managed-node1 51385 1727204607.61322: done getting next task for host managed-node1 51385 1727204607.61326: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 51385 1727204607.61329: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51385 1727204607.61347: getting variables 51385 1727204607.61349: in VariableManager get_vars() 51385 1727204607.61391: Calling all_inventory to load vars for managed-node1 51385 1727204607.61394: Calling groups_inventory to load vars for managed-node1 51385 1727204607.61396: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204607.61406: Calling all_plugins_play to load vars for managed-node1 51385 1727204607.61408: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204607.61411: Calling groups_plugins_play to load vars for managed-node1 51385 1727204607.63242: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204607.65477: done with get_vars() 51385 1727204607.65505: done getting variables 51385 1727204607.65574: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 15:03:27 -0400 (0:00:00.070) 0:00:26.060 ***** 51385 1727204607.65613: entering _queue_task() for managed-node1/copy 51385 1727204607.65956: worker is 1 (out of 1 available) 51385 1727204607.65975: exiting _queue_task() for managed-node1/copy 51385 1727204607.65988: done queuing things up, now waiting for results queue to drain 51385 1727204607.65989: waiting for pending results... 
51385 1727204607.66344: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 51385 1727204607.66509: in run() - task 0affcd87-79f5-6b1f-5706-000000000074 51385 1727204607.66531: variable 'ansible_search_path' from source: unknown 51385 1727204607.66538: variable 'ansible_search_path' from source: unknown 51385 1727204607.66583: calling self._execute() 51385 1727204607.66689: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204607.66705: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204607.66719: variable 'omit' from source: magic vars 51385 1727204607.67112: variable 'ansible_distribution_major_version' from source: facts 51385 1727204607.67136: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204607.67262: variable 'network_provider' from source: set_fact 51385 1727204607.67275: Evaluated conditional (network_provider == "initscripts"): False 51385 1727204607.67283: when evaluation is False, skipping this task 51385 1727204607.67291: _execute() done 51385 1727204607.67298: dumping result to json 51385 1727204607.67305: done dumping result, returning 51385 1727204607.67315: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcd87-79f5-6b1f-5706-000000000074] 51385 1727204607.67326: sending task result for task 0affcd87-79f5-6b1f-5706-000000000074 51385 1727204607.67441: done sending task result for task 0affcd87-79f5-6b1f-5706-000000000074 51385 1727204607.67447: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 51385 1727204607.67500: no more pending results, returning what we have 51385 1727204607.67504: results queue empty 51385 1727204607.67505: checking for 
any_errors_fatal 51385 1727204607.67511: done checking for any_errors_fatal 51385 1727204607.67511: checking for max_fail_percentage 51385 1727204607.67513: done checking for max_fail_percentage 51385 1727204607.67514: checking to see if all hosts have failed and the running result is not ok 51385 1727204607.67515: done checking to see if all hosts have failed 51385 1727204607.67515: getting the remaining hosts for this loop 51385 1727204607.67517: done getting the remaining hosts for this loop 51385 1727204607.67521: getting the next task for host managed-node1 51385 1727204607.67527: done getting next task for host managed-node1 51385 1727204607.67531: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 51385 1727204607.67534: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51385 1727204607.67551: getting variables 51385 1727204607.67553: in VariableManager get_vars() 51385 1727204607.67599: Calling all_inventory to load vars for managed-node1 51385 1727204607.67602: Calling groups_inventory to load vars for managed-node1 51385 1727204607.67604: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204607.67616: Calling all_plugins_play to load vars for managed-node1 51385 1727204607.67619: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204607.67621: Calling groups_plugins_play to load vars for managed-node1 51385 1727204607.71463: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204607.73211: done with get_vars() 51385 1727204607.73237: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 15:03:27 -0400 (0:00:00.077) 0:00:26.137 ***** 51385 1727204607.73333: entering _queue_task() for managed-node1/fedora.linux_system_roles.network_connections 51385 1727204607.73678: worker is 1 (out of 1 available) 51385 1727204607.73692: exiting _queue_task() for managed-node1/fedora.linux_system_roles.network_connections 51385 1727204607.73703: done queuing things up, now waiting for results queue to drain 51385 1727204607.73704: waiting for pending results... 
51385 1727204607.74129: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 51385 1727204607.74289: in run() - task 0affcd87-79f5-6b1f-5706-000000000075 51385 1727204607.74339: variable 'ansible_search_path' from source: unknown 51385 1727204607.74348: variable 'ansible_search_path' from source: unknown 51385 1727204607.74399: calling self._execute() 51385 1727204607.74507: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204607.74519: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204607.74535: variable 'omit' from source: magic vars 51385 1727204607.74942: variable 'ansible_distribution_major_version' from source: facts 51385 1727204607.74963: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204607.74977: variable 'omit' from source: magic vars 51385 1727204607.75038: variable 'omit' from source: magic vars 51385 1727204607.75215: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 51385 1727204607.77610: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 51385 1727204607.77701: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 51385 1727204607.77749: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 51385 1727204607.77791: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 51385 1727204607.77826: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 51385 1727204607.77925: variable 'network_provider' from source: set_fact 51385 1727204607.78070: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 51385 1727204607.78103: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 51385 1727204607.78133: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 51385 1727204607.78184: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 51385 1727204607.78204: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 51385 1727204607.78285: variable 'omit' from source: magic vars 51385 1727204607.78410: variable 'omit' from source: magic vars 51385 1727204607.78524: variable 'network_connections' from source: task vars 51385 1727204607.78540: variable 'interface' from source: play vars 51385 1727204607.78610: variable 'interface' from source: play vars 51385 1727204607.78623: variable 'vlan_interface' from source: play vars 51385 1727204607.78688: variable 'vlan_interface' from source: play vars 51385 1727204607.78852: variable 'omit' from source: magic vars 51385 1727204607.78866: variable '__lsr_ansible_managed' from source: task vars 51385 1727204607.78934: variable '__lsr_ansible_managed' from source: task vars 51385 1727204607.79221: Loaded config def from plugin (lookup/template) 51385 1727204607.79233: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 51385 1727204607.79273: File lookup term: 
get_ansible_managed.j2 51385 1727204607.79281: variable 'ansible_search_path' from source: unknown 51385 1727204607.79290: evaluation_path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 51385 1727204607.79306: search_path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 51385 1727204607.79325: variable 'ansible_search_path' from source: unknown 51385 1727204607.85812: variable 'ansible_managed' from source: unknown 51385 1727204607.85972: variable 'omit' from source: magic vars 51385 1727204607.86008: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 51385 1727204607.86038: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 51385 1727204607.86060: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 51385 1727204607.86084: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 51385 1727204607.86097: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 51385 1727204607.86126: variable 'inventory_hostname' from source: host vars for 'managed-node1' 51385 1727204607.86132: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204607.86138: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204607.86229: Set connection var ansible_pipelining to False 51385 1727204607.86236: Set connection var ansible_shell_type to sh 51385 1727204607.86249: Set connection var ansible_module_compression to ZIP_DEFLATED 51385 1727204607.86260: Set connection var ansible_timeout to 10 51385 1727204607.86271: Set connection var ansible_connection to ssh 51385 1727204607.86281: Set connection var ansible_shell_executable to /bin/sh 51385 1727204607.86314: variable 'ansible_shell_executable' from source: unknown 51385 1727204607.86323: variable 'ansible_connection' from source: unknown 51385 1727204607.86329: variable 'ansible_module_compression' from source: unknown 51385 1727204607.86336: variable 'ansible_shell_type' from source: unknown 51385 1727204607.86342: variable 'ansible_shell_executable' from source: unknown 51385 1727204607.86348: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204607.86356: variable 'ansible_pipelining' from source: unknown 51385 1727204607.86361: variable 'ansible_timeout' from source: unknown 51385 1727204607.86372: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204607.86522: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 51385 1727204607.86547: variable 'omit' from source: magic vars 51385 1727204607.86560: starting attempt loop 51385 1727204607.86570: running the handler 51385 1727204607.86589: _low_level_execute_command(): starting 51385 1727204607.86600: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 51385 1727204607.87372: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204607.87394: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204607.87411: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204607.87430: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204607.87477: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204607.87494: stderr chunk (state=3): >>>debug2: match not found <<< 51385 1727204607.87510: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204607.87529: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 51385 1727204607.87541: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 51385 1727204607.87553: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 51385 1727204607.87569: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204607.87584: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204607.87603: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204607.87617: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.9.148 originally 10.31.9.148 <<< 51385 1727204607.87628: stderr chunk (state=3): >>>debug2: match found <<< 51385 1727204607.87645: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204607.87724: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204607.87748: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204607.87767: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204607.87861: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204607.89504: stdout chunk (state=3): >>>/root <<< 51385 1727204607.89607: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204607.89716: stderr chunk (state=3): >>><<< 51385 1727204607.89731: stdout chunk (state=3): >>><<< 51385 1727204607.89862: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204607.89869: _low_level_execute_command(): starting 51385 1727204607.89873: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204607.8976707-53143-74977367335283 `" && echo ansible-tmp-1727204607.8976707-53143-74977367335283="` echo /root/.ansible/tmp/ansible-tmp-1727204607.8976707-53143-74977367335283 `" ) && sleep 0' 51385 1727204607.90450: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204607.90466: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204607.90482: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204607.90500: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204607.90547: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204607.90558: stderr chunk (state=3): >>>debug2: match not found <<< 51385 1727204607.90575: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204607.90592: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 51385 1727204607.90602: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 51385 1727204607.90611: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 51385 1727204607.90622: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204607.90635: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204607.90654: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204607.90666: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204607.90677: stderr chunk (state=3): >>>debug2: match found <<< 51385 1727204607.90689: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204607.90761: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204607.90786: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204607.90802: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204607.90896: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204607.92747: stdout chunk (state=3): >>>ansible-tmp-1727204607.8976707-53143-74977367335283=/root/.ansible/tmp/ansible-tmp-1727204607.8976707-53143-74977367335283 <<< 51385 1727204607.92958: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204607.92962: stdout chunk (state=3): >>><<< 51385 1727204607.92966: stderr chunk (state=3): >>><<< 51385 1727204607.93309: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204607.8976707-53143-74977367335283=/root/.ansible/tmp/ansible-tmp-1727204607.8976707-53143-74977367335283 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204607.93317: variable 'ansible_module_compression' from source: unknown 51385 1727204607.93320: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-51385tpxlmlox/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 51385 1727204607.93322: variable 'ansible_facts' from source: unknown 51385 1727204607.93324: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204607.8976707-53143-74977367335283/AnsiballZ_network_connections.py 51385 1727204607.93856: Sending initial data 51385 1727204607.93860: Sent initial data (167 bytes) 51385 1727204607.94925: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204607.94930: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204607.94959: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204607.94963: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 51385 1727204607.94968: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204607.95048: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204607.95063: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204607.95157: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204607.96880: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 51385 1727204607.96923: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 51385 1727204607.96981: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-51385tpxlmlox/tmp6aqpcf6p /root/.ansible/tmp/ansible-tmp-1727204607.8976707-53143-74977367335283/AnsiballZ_network_connections.py <<< 51385 1727204607.97033: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 51385 1727204607.99017: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204607.99171: stderr chunk (state=3): >>><<< 51385 
1727204607.99175: stdout chunk (state=3): >>><<< 51385 1727204607.99177: done transferring module to remote 51385 1727204607.99180: _low_level_execute_command(): starting 51385 1727204607.99182: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204607.8976707-53143-74977367335283/ /root/.ansible/tmp/ansible-tmp-1727204607.8976707-53143-74977367335283/AnsiballZ_network_connections.py && sleep 0' 51385 1727204608.00875: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204608.00938: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204608.00952: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204608.00972: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204608.01015: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204608.01148: stderr chunk (state=3): >>>debug2: match not found <<< 51385 1727204608.01162: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204608.01181: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 51385 1727204608.01191: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 51385 1727204608.01199: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 51385 1727204608.01209: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204608.01219: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204608.01231: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204608.01243: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 
originally 10.31.9.148 <<< 51385 1727204608.01254: stderr chunk (state=3): >>>debug2: match found <<< 51385 1727204608.01272: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204608.01348: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204608.01477: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204608.01494: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204608.01585: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204608.03387: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204608.03391: stdout chunk (state=3): >>><<< 51385 1727204608.03393: stderr chunk (state=3): >>><<< 51385 1727204608.03498: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: 
master session id: 2 debug2: Received exit status from master 0 51385 1727204608.03502: _low_level_execute_command(): starting 51385 1727204608.03504: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204607.8976707-53143-74977367335283/AnsiballZ_network_connections.py && sleep 0' 51385 1727204608.05118: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204608.05122: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204608.05273: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 51385 1727204608.05278: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204608.05280: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found <<< 51385 1727204608.05292: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204608.05420: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204608.05469: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204608.05680: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204608.45595: stdout chunk (state=3): >>>Traceback (most recent 
call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_jso6o2ud/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back <<< 51385 1727204608.45643: stdout chunk (state=3): >>> File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_jso6o2ud/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on lsr101/1618f11a-f530-4474-ba61-deb3b396c4a9: error=unknown <<< 51385 1727204608.47342: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_jso6o2ud/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_jso6o2ud/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail <<< 51385 1727204608.47388: stdout chunk (state=3): >>>ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on lsr101.90/b2e59a26-dd89-4665-aa15-863b790a948c: error=unknown <<< 51385 1727204608.47602: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr101", "persistent_state": "absent", "state": "down"}, {"name": "lsr101.90", 
"persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr101", "persistent_state": "absent", "state": "down"}, {"name": "lsr101.90", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 51385 1727204608.49257: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. <<< 51385 1727204608.49261: stdout chunk (state=3): >>><<< 51385 1727204608.49274: stderr chunk (state=3): >>><<< 51385 1727204608.49294: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_jso6o2ud/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_jso6o2ud/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on lsr101/1618f11a-f530-4474-ba61-deb3b396c4a9: error=unknown Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_jso6o2ud/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File 
"/tmp/ansible_fedora.linux_system_roles.network_connections_payload_jso6o2ud/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on lsr101.90/b2e59a26-dd89-4665-aa15-863b790a948c: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr101", "persistent_state": "absent", "state": "down"}, {"name": "lsr101.90", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr101", "persistent_state": "absent", "state": "down"}, {"name": "lsr101.90", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 
10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 51385 1727204608.49339: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'lsr101', 'persistent_state': 'absent', 'state': 'down'}, {'name': 'lsr101.90', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204607.8976707-53143-74977367335283/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 51385 1727204608.49351: _low_level_execute_command(): starting 51385 1727204608.49354: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204607.8976707-53143-74977367335283/ > /dev/null 2>&1 && sleep 0' 51385 1727204608.50012: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204608.50022: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204608.50031: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 
51385 1727204608.50045: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
51385 1727204608.50094: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<<
51385 1727204608.50101: stderr chunk (state=3): >>>debug2: match not found <<<
51385 1727204608.50110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
51385 1727204608.50125: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
51385 1727204608.50130: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<<
51385 1727204608.50137: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
51385 1727204608.50143: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
51385 1727204608.50152: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
51385 1727204608.50167: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
51385 1727204608.50176: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<<
51385 1727204608.50183: stderr chunk (state=3): >>>debug2: match found <<<
51385 1727204608.50192: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
51385 1727204608.50262: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
51385 1727204608.50285: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
51385 1727204608.50297: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
51385 1727204608.50383: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
51385 1727204608.52294: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
51385 1727204608.52298: stdout chunk (state=3): >>><<<
51385 1727204608.52303: stderr chunk (state=3): >>><<<
51385 1727204608.52349: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.9.148 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
51385 1727204608.52353: handler run complete
51385 1727204608.52391: attempt loop complete, returning result
51385 1727204608.52394: _execute() done
51385 1727204608.52396: dumping result to json
51385 1727204608.52401: done dumping result, returning
51385 1727204608.52411: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcd87-79f5-6b1f-5706-000000000075]
51385 1727204608.52418: sending task result for task 0affcd87-79f5-6b1f-5706-000000000075
51385 1727204608.52534: done sending task result for task 0affcd87-79f5-6b1f-5706-000000000075
51385 1727204608.52537: WORKER PROCESS EXITING
changed: [managed-node1] => {
    "_invocation": {
        "module_args": {
            "__debug_flags": "",
            "__header": "#\n# Ansible managed\n#\n# system_role:network\n",
            "connections": [
                {
                    "name": "lsr101",
                    "persistent_state": "absent",
                    "state": "down"
                },
                {
                    "name": "lsr101.90",
                    "persistent_state": "absent",
                    "state": "down"
                }
            ],
            "force_state_change": false,
            "ignore_errors": false,
            "provider": "nm"
        }
    },
    "changed": true
}

STDERR:

51385 1727204608.52788: no more pending results, returning what we have
51385 1727204608.52792: results queue empty
51385 1727204608.52793: checking for any_errors_fatal
51385 1727204608.52799: done checking for any_errors_fatal
51385 1727204608.52800: checking for max_fail_percentage
51385 1727204608.52802: done checking for max_fail_percentage
51385 1727204608.52804: checking to see if all hosts have failed and the running result is not ok
51385 1727204608.52804: done checking to see if all hosts have failed
51385 1727204608.52805: getting the remaining hosts for this loop
51385 1727204608.52807: done getting the remaining hosts for this loop
51385 1727204608.52811: getting the next task for host managed-node1
51385 1727204608.52817: done getting next task for host managed-node1
51385 1727204608.52820: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state
51385 1727204608.52823: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
51385 1727204608.52835: getting variables
51385 1727204608.52837: in VariableManager get_vars()
51385 1727204608.52886: Calling all_inventory to load vars for managed-node1
51385 1727204608.52889: Calling groups_inventory to load vars for managed-node1
51385 1727204608.52892: Calling all_plugins_inventory to load vars for managed-node1
51385 1727204608.52902: Calling all_plugins_play to load vars for managed-node1
51385 1727204608.52905: Calling groups_plugins_inventory to load vars for managed-node1
51385 1727204608.52908: Calling groups_plugins_play to load vars for managed-node1
51385 1727204608.54784: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
51385 1727204608.55734: done with get_vars()
51385 1727204608.55754: done getting variables

TASK [fedora.linux_system_roles.network : Configure networking state] **********
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171
Tuesday 24 September 2024 15:03:28 -0400 (0:00:00.824) 0:00:26.962 *****
51385 1727204608.55822: entering _queue_task() for managed-node1/fedora.linux_system_roles.network_state
51385 1727204608.56059: worker is 1 (out of 1 available)
51385 1727204608.56075: exiting _queue_task() for managed-node1/fedora.linux_system_roles.network_state
51385 1727204608.56088: done queuing things up, now waiting for results queue to drain
51385 1727204608.56089: waiting for pending results...
51385 1727204608.56430: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking state
51385 1727204608.56556: in run() - task 0affcd87-79f5-6b1f-5706-000000000076
51385 1727204608.56560: variable 'ansible_search_path' from source: unknown
51385 1727204608.56563: variable 'ansible_search_path' from source: unknown
51385 1727204608.56570: calling self._execute()
51385 1727204608.56673: variable 'ansible_host' from source: host vars for 'managed-node1'
51385 1727204608.56683: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
51385 1727204608.56699: variable 'omit' from source: magic vars
51385 1727204608.57115: variable 'ansible_distribution_major_version' from source: facts
51385 1727204608.57134: Evaluated conditional (ansible_distribution_major_version != '6'): True
51385 1727204608.57306: variable 'network_state' from source: role '' defaults
51385 1727204608.57325: Evaluated conditional (network_state != {}): False
51385 1727204608.57342: when evaluation is False, skipping this task
51385 1727204608.57345: _execute() done
51385 1727204608.57350: dumping result to json
51385 1727204608.57353: done dumping result, returning
51385 1727204608.57366: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking state [0affcd87-79f5-6b1f-5706-000000000076]
51385 1727204608.57381: sending task result for task 0affcd87-79f5-6b1f-5706-000000000076
51385 1727204608.57512: done sending task result for task 0affcd87-79f5-6b1f-5706-000000000076
51385 1727204608.57514: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
51385 1727204608.57563: no more pending results, returning what we have
51385 1727204608.57569: results queue empty
51385 1727204608.57570: checking for any_errors_fatal
51385 1727204608.57580: done checking for any_errors_fatal
51385 1727204608.57581: checking for max_fail_percentage
51385 1727204608.57583: done checking for max_fail_percentage
51385 1727204608.57584: checking to see if all hosts have failed and the running result is not ok
51385 1727204608.57585: done checking to see if all hosts have failed
51385 1727204608.57585: getting the remaining hosts for this loop
51385 1727204608.57587: done getting the remaining hosts for this loop
51385 1727204608.57591: getting the next task for host managed-node1
51385 1727204608.57598: done getting next task for host managed-node1
51385 1727204608.57602: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections
51385 1727204608.57605: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
51385 1727204608.57623: getting variables
51385 1727204608.57625: in VariableManager get_vars()
51385 1727204608.57667: Calling all_inventory to load vars for managed-node1
51385 1727204608.57671: Calling groups_inventory to load vars for managed-node1
51385 1727204608.57674: Calling all_plugins_inventory to load vars for managed-node1
51385 1727204608.57684: Calling all_plugins_play to load vars for managed-node1
51385 1727204608.57686: Calling groups_plugins_inventory to load vars for managed-node1
51385 1727204608.57689: Calling groups_plugins_play to load vars for managed-node1
51385 1727204608.58698: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
51385 1727204608.59924: done with get_vars()
51385 1727204608.59945: done getting variables
51385 1727204608.60412: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] ***
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177
Tuesday 24 September 2024 15:03:28 -0400 (0:00:00.046) 0:00:27.008 *****
51385 1727204608.60450: entering _queue_task() for managed-node1/debug
51385 1727204608.60789: worker is 1 (out of 1 available)
51385 1727204608.60802: exiting _queue_task() for managed-node1/debug
51385 1727204608.60820: done queuing things up, now waiting for results queue to drain
51385 1727204608.60822: waiting for pending results...
51385 1727204608.61181: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections
51385 1727204608.61292: in run() - task 0affcd87-79f5-6b1f-5706-000000000077
51385 1727204608.61305: variable 'ansible_search_path' from source: unknown
51385 1727204608.61309: variable 'ansible_search_path' from source: unknown
51385 1727204608.61345: calling self._execute()
51385 1727204608.61444: variable 'ansible_host' from source: host vars for 'managed-node1'
51385 1727204608.61448: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
51385 1727204608.61460: variable 'omit' from source: magic vars
51385 1727204608.62612: variable 'ansible_distribution_major_version' from source: facts
51385 1727204608.62630: Evaluated conditional (ansible_distribution_major_version != '6'): True
51385 1727204608.62641: variable 'omit' from source: magic vars
51385 1727204608.63000: variable 'omit' from source: magic vars
51385 1727204608.63036: variable 'omit' from source: magic vars
51385 1727204608.63078: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
51385 1727204608.63123: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
51385 1727204608.63146: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
51385 1727204608.63169: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
51385 1727204608.63186: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
51385 1727204608.63228: variable 'inventory_hostname' from source: host vars for 'managed-node1'
51385 1727204608.63236: variable 'ansible_host' from source: host vars for 'managed-node1'
51385 1727204608.63244: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
51385 1727204608.63362: Set connection var ansible_pipelining to False
51385 1727204608.63373: Set connection var ansible_shell_type to sh
51385 1727204608.63389: Set connection var ansible_module_compression to ZIP_DEFLATED
51385 1727204608.63400: Set connection var ansible_timeout to 10
51385 1727204608.63409: Set connection var ansible_connection to ssh
51385 1727204608.63426: Set connection var ansible_shell_executable to /bin/sh
51385 1727204608.63454: variable 'ansible_shell_executable' from source: unknown
51385 1727204608.63461: variable 'ansible_connection' from source: unknown
51385 1727204608.63472: variable 'ansible_module_compression' from source: unknown
51385 1727204608.63479: variable 'ansible_shell_type' from source: unknown
51385 1727204608.63485: variable 'ansible_shell_executable' from source: unknown
51385 1727204608.63491: variable 'ansible_host' from source: host vars for 'managed-node1'
51385 1727204608.63498: variable 'ansible_pipelining' from source: unknown
51385 1727204608.63505: variable 'ansible_timeout' from source: unknown
51385 1727204608.63512: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
51385 1727204608.63667: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
51385 1727204608.63684: variable 'omit' from source: magic vars
51385 1727204608.63692: starting attempt loop
51385 1727204608.63697: running the handler
51385 1727204608.63822: variable '__network_connections_result' from source: set_fact
51385 1727204608.63886: handler run complete
51385 1727204608.63907: attempt loop complete, returning result
51385 1727204608.63913: _execute() done
51385 1727204608.63919: dumping result to json
51385 1727204608.63925: done dumping result, returning
51385 1727204608.63936: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcd87-79f5-6b1f-5706-000000000077]
51385 1727204608.63945: sending task result for task 0affcd87-79f5-6b1f-5706-000000000077
51385 1727204608.64057: done sending task result for task 0affcd87-79f5-6b1f-5706-000000000077
51385 1727204608.64071: WORKER PROCESS EXITING
ok: [managed-node1] => {
    "__network_connections_result.stderr_lines": [
        ""
    ]
}
51385 1727204608.64140: no more pending results, returning what we have
51385 1727204608.64144: results queue empty
51385 1727204608.64145: checking for any_errors_fatal
51385 1727204608.64152: done checking for any_errors_fatal
51385 1727204608.64152: checking for max_fail_percentage
51385 1727204608.64154: done checking for max_fail_percentage
51385 1727204608.64155: checking to see if all hosts have failed and the running result is not ok
51385 1727204608.64156: done checking to see if all hosts have failed
51385 1727204608.64157: getting the remaining hosts for this loop
51385 1727204608.64159: done getting the remaining hosts for this loop
51385 1727204608.64163: getting the next task for host managed-node1
51385 1727204608.64172: done getting next task for host managed-node1
51385 1727204608.64177: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections
51385 1727204608.64181: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
51385 1727204608.64196: getting variables
51385 1727204608.64198: in VariableManager get_vars()
51385 1727204608.64240: Calling all_inventory to load vars for managed-node1
51385 1727204608.64243: Calling groups_inventory to load vars for managed-node1
51385 1727204608.64245: Calling all_plugins_inventory to load vars for managed-node1
51385 1727204608.64255: Calling all_plugins_play to load vars for managed-node1
51385 1727204608.64258: Calling groups_plugins_inventory to load vars for managed-node1
51385 1727204608.64261: Calling groups_plugins_play to load vars for managed-node1
51385 1727204608.68125: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
51385 1727204608.70780: done with get_vars()
51385 1727204608.70804: done getting variables
51385 1727204608.70879: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] ***
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181
Tuesday 24 September 2024 15:03:28 -0400 (0:00:00.104) 0:00:27.113 *****
51385 1727204608.70915: entering _queue_task() for managed-node1/debug
51385 1727204608.71497: worker is 1 (out of 1 available)
51385 1727204608.71515: exiting _queue_task() for managed-node1/debug
51385 1727204608.71527: done queuing things up, now waiting for results queue to drain
51385 1727204608.71528: waiting for pending results...
51385 1727204608.72338: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections
51385 1727204608.72488: in run() - task 0affcd87-79f5-6b1f-5706-000000000078
51385 1727204608.72654: variable 'ansible_search_path' from source: unknown
51385 1727204608.72706: variable 'ansible_search_path' from source: unknown
51385 1727204608.72755: calling self._execute()
51385 1727204608.73111: variable 'ansible_host' from source: host vars for 'managed-node1'
51385 1727204608.73143: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
51385 1727204608.73207: variable 'omit' from source: magic vars
51385 1727204608.73607: variable 'ansible_distribution_major_version' from source: facts
51385 1727204608.73631: Evaluated conditional (ansible_distribution_major_version != '6'): True
51385 1727204608.73642: variable 'omit' from source: magic vars
51385 1727204608.73703: variable 'omit' from source: magic vars
51385 1727204608.73750: variable 'omit' from source: magic vars
51385 1727204608.73794: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
51385 1727204608.73841: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
51385 1727204608.73868: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
51385 1727204608.73890: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
51385 1727204608.73905: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
51385 1727204608.73938: variable 'inventory_hostname' from source: host vars for 'managed-node1'
51385 1727204608.73950: variable 'ansible_host' from source: host vars for 'managed-node1'
51385 1727204608.73957: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
51385 1727204608.74061: Set connection var ansible_pipelining to False
51385 1727204608.74072: Set connection var ansible_shell_type to sh
51385 1727204608.74086: Set connection var ansible_module_compression to ZIP_DEFLATED
51385 1727204608.74097: Set connection var ansible_timeout to 10
51385 1727204608.74103: Set connection var ansible_connection to ssh
51385 1727204608.74111: Set connection var ansible_shell_executable to /bin/sh
51385 1727204608.74135: variable 'ansible_shell_executable' from source: unknown
51385 1727204608.74141: variable 'ansible_connection' from source: unknown
51385 1727204608.74147: variable 'ansible_module_compression' from source: unknown
51385 1727204608.74153: variable 'ansible_shell_type' from source: unknown
51385 1727204608.74166: variable 'ansible_shell_executable' from source: unknown
51385 1727204608.74175: variable 'ansible_host' from source: host vars for 'managed-node1'
51385 1727204608.74181: variable 'ansible_pipelining' from source: unknown
51385 1727204608.74187: variable 'ansible_timeout' from source: unknown
51385 1727204608.74194: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
51385 1727204608.74331: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
51385 1727204608.74348: variable 'omit' from source: magic vars
51385 1727204608.74357: starting attempt loop
51385 1727204608.74365: running the handler
51385 1727204608.74423: variable '__network_connections_result' from source: set_fact
51385 1727204608.74514: variable '__network_connections_result' from source: set_fact
51385 1727204608.74640: handler run complete
51385 1727204608.74674: attempt loop complete, returning result
51385 1727204608.74682: _execute() done
51385 1727204608.74687: dumping result to json
51385 1727204608.74695: done dumping result, returning
51385 1727204608.74714: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcd87-79f5-6b1f-5706-000000000078]
51385 1727204608.74725: sending task result for task 0affcd87-79f5-6b1f-5706-000000000078
ok: [managed-node1] => {
    "__network_connections_result": {
        "_invocation": {
            "module_args": {
                "__debug_flags": "",
                "__header": "#\n# Ansible managed\n#\n# system_role:network\n",
                "connections": [
                    {
                        "name": "lsr101",
                        "persistent_state": "absent",
                        "state": "down"
                    },
                    {
                        "name": "lsr101.90",
                        "persistent_state": "absent",
                        "state": "down"
                    }
                ],
                "force_state_change": false,
                "ignore_errors": false,
                "provider": "nm"
            }
        },
        "changed": true,
        "failed": false,
        "stderr": "\n",
        "stderr_lines": [
            ""
        ]
    }
}
51385 1727204608.74923: no more pending results, returning what we have
51385 1727204608.74927: results queue empty
51385 1727204608.74928: checking for any_errors_fatal
51385 1727204608.74936: done checking for any_errors_fatal
51385 1727204608.74936: checking for max_fail_percentage
51385 1727204608.74940: done checking for max_fail_percentage
51385 1727204608.74941: checking to see if all hosts have failed and the running result is not ok
51385 1727204608.74942: done checking to see if all hosts have failed
51385 1727204608.74943: getting the remaining hosts for this loop
51385 1727204608.74945: done getting the remaining hosts for this loop
51385 1727204608.74949: getting the next task for host managed-node1
51385 1727204608.74955: done getting next task for host managed-node1
51385 1727204608.74959: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state
51385 1727204608.74963: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
51385 1727204608.74978: getting variables
51385 1727204608.74980: in VariableManager get_vars()
51385 1727204608.75022: Calling all_inventory to load vars for managed-node1
51385 1727204608.75025: Calling groups_inventory to load vars for managed-node1
51385 1727204608.75028: Calling all_plugins_inventory to load vars for managed-node1
51385 1727204608.75039: Calling all_plugins_play to load vars for managed-node1
51385 1727204608.75042: Calling groups_plugins_inventory to load vars for managed-node1
51385 1727204608.75045: Calling groups_plugins_play to load vars for managed-node1
51385 1727204608.76006: done sending task result for task 0affcd87-79f5-6b1f-5706-000000000078
51385 1727204608.76009: WORKER PROCESS EXITING
51385 1727204608.76860: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
51385 1727204608.78615: done with get_vars()
51385 1727204608.78646: done getting variables
51385 1727204608.78707: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] ***
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186
Tuesday 24 September 2024 15:03:28 -0400 (0:00:00.078) 0:00:27.191 *****
51385 1727204608.78754: entering _queue_task() for managed-node1/debug
51385 1727204608.79071: worker is 1 (out of 1 available)
51385 1727204608.79083: exiting _queue_task() for managed-node1/debug
51385 1727204608.79095: done queuing things up, now waiting for results queue to drain
51385 1727204608.79096: waiting for pending results...
51385 1727204608.79393: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state
51385 1727204608.79544: in run() - task 0affcd87-79f5-6b1f-5706-000000000079
51385 1727204608.79567: variable 'ansible_search_path' from source: unknown
51385 1727204608.79575: variable 'ansible_search_path' from source: unknown
51385 1727204608.79617: calling self._execute()
51385 1727204608.79730: variable 'ansible_host' from source: host vars for 'managed-node1'
51385 1727204608.79744: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
51385 1727204608.79760: variable 'omit' from source: magic vars
51385 1727204608.80143: variable 'ansible_distribution_major_version' from source: facts
51385 1727204608.80159: Evaluated conditional (ansible_distribution_major_version != '6'): True
51385 1727204608.80296: variable 'network_state' from source: role '' defaults
51385 1727204608.80310: Evaluated conditional (network_state != {}): False
51385 1727204608.80317: when evaluation is False, skipping this task
51385 1727204608.80323: _execute() done
51385 1727204608.80330: dumping result to json
51385 1727204608.80337: done dumping result, returning
51385 1727204608.80346: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcd87-79f5-6b1f-5706-000000000079]
51385 1727204608.80356: sending task result for task 0affcd87-79f5-6b1f-5706-000000000079
skipping: [managed-node1] => {
    "false_condition": "network_state != {}"
}
51385 1727204608.80506: no more pending results, returning what we have
51385 1727204608.80510: results queue empty
51385 1727204608.80511: checking for any_errors_fatal
51385 1727204608.80517: done checking for any_errors_fatal
51385 1727204608.80518: checking for max_fail_percentage
51385 1727204608.80520: done checking for max_fail_percentage
51385 1727204608.80521: checking to see if all hosts have failed and the running result is not ok
51385 1727204608.80522: done checking to see if all hosts have failed
51385 1727204608.80523: getting the remaining hosts for this loop
51385 1727204608.80525: done getting the remaining hosts for this loop
51385 1727204608.80529: getting the next task for host managed-node1
51385 1727204608.80535: done getting next task for host managed-node1
51385 1727204608.80540: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity
51385 1727204608.80544: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
51385 1727204608.80566: getting variables
51385 1727204608.80568: in VariableManager get_vars()
51385 1727204608.80611: Calling all_inventory to load vars for managed-node1
51385 1727204608.80613: Calling groups_inventory to load vars for managed-node1
51385 1727204608.80616: Calling all_plugins_inventory to load vars for managed-node1
51385 1727204608.80629: Calling all_plugins_play to load vars for managed-node1
51385 1727204608.80632: Calling groups_plugins_inventory to load vars for managed-node1
51385 1727204608.80635: Calling groups_plugins_play to load vars for managed-node1
51385 1727204608.81605: done sending task result for task 0affcd87-79f5-6b1f-5706-000000000079
51385 1727204608.81609: WORKER PROCESS EXITING
51385 1727204608.82605: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
51385 1727204608.84282: done with get_vars()
51385 1727204608.84308: done getting variables

TASK [fedora.linux_system_roles.network : Re-test connectivity] ****************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
Tuesday 24 September 2024 15:03:28 -0400 (0:00:00.056) 0:00:27.248 *****
51385 1727204608.84420: entering _queue_task() for managed-node1/ping
51385 1727204608.84780: worker is 1 (out of 1 available)
51385 1727204608.84793: exiting _queue_task() for managed-node1/ping
51385 1727204608.84805: done queuing things up, now waiting for results queue to drain
51385 1727204608.84806: waiting for pending results...
51385 1727204608.85117: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Re-test connectivity
51385 1727204608.85272: in run() - task 0affcd87-79f5-6b1f-5706-00000000007a
51385 1727204608.85293: variable 'ansible_search_path' from source: unknown
51385 1727204608.85300: variable 'ansible_search_path' from source: unknown
51385 1727204608.85342: calling self._execute()
51385 1727204608.85442: variable 'ansible_host' from source: host vars for 'managed-node1'
51385 1727204608.85452: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
51385 1727204608.85473: variable 'omit' from source: magic vars
51385 1727204608.85863: variable 'ansible_distribution_major_version' from source: facts
51385 1727204608.85881: Evaluated conditional (ansible_distribution_major_version != '6'): True
51385 1727204608.85893: variable 'omit' from source: magic vars
51385 1727204608.85948: variable 'omit' from source: magic vars
51385 1727204608.85993: variable 'omit' from source: magic vars
51385 1727204608.86042: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
51385 1727204608.86088: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
51385 1727204608.86117: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
51385 1727204608.86138: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
51385 1727204608.86153: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
51385 1727204608.86191: variable 'inventory_hostname' from source: host vars for 'managed-node1'
51385 1727204608.86199: variable 'ansible_host' from source: host vars for 'managed-node1'
51385 1727204608.86206: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
51385 1727204608.86315: Set connection var ansible_pipelining to False
51385 1727204608.86323: Set connection var ansible_shell_type to sh
51385 1727204608.86341: Set connection var ansible_module_compression to ZIP_DEFLATED
51385 1727204608.86352: Set connection var ansible_timeout to 10
51385 1727204608.86358: Set connection var ansible_connection to ssh
51385 1727204608.86368: Set connection var ansible_shell_executable to /bin/sh
51385 1727204608.86399: variable 'ansible_shell_executable' from source: unknown
51385 1727204608.86407: variable 'ansible_connection' from source: unknown
51385 1727204608.86416: variable 'ansible_module_compression' from source: unknown
51385 1727204608.86422: variable 'ansible_shell_type' from source: unknown
51385 1727204608.86427: variable 'ansible_shell_executable' from source: unknown
51385 1727204608.86433: variable 'ansible_host' from source: host vars for 'managed-node1'
51385 1727204608.86444: variable 'ansible_pipelining' from source: unknown
51385 1727204608.86450: variable 'ansible_timeout' from source: unknown
51385 1727204608.86456: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
51385 1727204608.86670: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__)
51385 1727204608.86687: variable 'omit' from source: magic vars
51385 1727204608.86696: starting attempt loop
51385 1727204608.86702: running the handler
51385 1727204608.86724: _low_level_execute_command(): starting
51385 1727204608.86735: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
51385 1727204608.87519: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
51385 1727204608.87535: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
51385 1727204608.87549: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
51385 1727204608.87569: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
51385 1727204608.87617: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<<
51385 1727204608.87631: stderr chunk (state=3): >>>debug2: match not found <<<
51385 1727204608.87647: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
51385 1727204608.87670: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
51385 1727204608.87683: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<<
51385 1727204608.87693: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
51385 1727204608.87708: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
51385 1727204608.87722: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
51385 1727204608.87736: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
51385 1727204608.87751: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<<
51385 1727204608.87761: stderr chunk (state=3): >>>debug2: match found <<<
51385 1727204608.87778: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
51385 1727204608.87861: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
51385 1727204608.87880: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
51385 1727204608.87895: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
51385 1727204608.87997: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
51385 1727204608.89620: stdout chunk (state=3): >>>/root
<<<
51385 1727204608.89812: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
51385 1727204608.89815: stdout chunk (state=3): >>><<<
51385 1727204608.89818: stderr chunk (state=3): >>><<<
51385 1727204608.89927: _low_level_execute_command() done: rc=0, stdout=/root
, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.9.148 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
51385 1727204608.89931: _low_level_execute_command(): starting
51385 1727204608.89935: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204608.8983984-53194-139962201488520 `" && echo ansible-tmp-1727204608.8983984-53194-139962201488520="` echo /root/.ansible/tmp/ansible-tmp-1727204608.8983984-53194-139962201488520 `" ) && sleep 0'
51385 1727204608.90586: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL
3.2.2 4 Jun 2024 <<< 51385 1727204608.90600: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204608.90614: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204608.90631: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204608.90675: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204608.90686: stderr chunk (state=3): >>>debug2: match not found <<< 51385 1727204608.90699: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204608.90720: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 51385 1727204608.90733: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 51385 1727204608.90744: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 51385 1727204608.90755: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204608.90770: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204608.90785: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204608.90796: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204608.90805: stderr chunk (state=3): >>>debug2: match found <<< 51385 1727204608.90817: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204608.90897: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204608.90920: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204608.90938: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 
<<< 51385 1727204608.91028: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204608.92869: stdout chunk (state=3): >>>ansible-tmp-1727204608.8983984-53194-139962201488520=/root/.ansible/tmp/ansible-tmp-1727204608.8983984-53194-139962201488520 <<< 51385 1727204608.93066: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204608.93070: stdout chunk (state=3): >>><<< 51385 1727204608.93073: stderr chunk (state=3): >>><<< 51385 1727204608.93371: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204608.8983984-53194-139962201488520=/root/.ansible/tmp/ansible-tmp-1727204608.8983984-53194-139962201488520 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204608.93375: variable 'ansible_module_compression' from source: unknown 51385 1727204608.93377: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-51385tpxlmlox/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 51385 1727204608.93380: variable 'ansible_facts' from source: unknown 51385 1727204608.93382: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204608.8983984-53194-139962201488520/AnsiballZ_ping.py 51385 1727204608.93475: Sending initial data 51385 1727204608.93478: Sent initial data (153 bytes) 51385 1727204608.94456: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204608.94477: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204608.94493: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204608.94512: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204608.94557: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204608.94591: stderr chunk (state=3): >>>debug2: match not found <<< 51385 1727204608.94595: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 51385 1727204608.94597: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204608.94600: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204608.94721: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204608.94724: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 
51385 1727204608.94726: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204608.94802: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204608.96508: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 51385 1727204608.96561: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 51385 1727204608.96609: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-51385tpxlmlox/tmpx78w4zj3 /root/.ansible/tmp/ansible-tmp-1727204608.8983984-53194-139962201488520/AnsiballZ_ping.py <<< 51385 1727204608.96664: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 51385 1727204608.97689: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204608.97781: stderr chunk (state=3): >>><<< 51385 1727204608.97784: stdout chunk (state=3): >>><<< 51385 1727204608.97810: done transferring module to remote 51385 1727204608.97822: _low_level_execute_command(): starting 51385 1727204608.97827: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204608.8983984-53194-139962201488520/ /root/.ansible/tmp/ansible-tmp-1727204608.8983984-53194-139962201488520/AnsiballZ_ping.py && sleep 0' 51385 1727204608.98443: stderr chunk (state=2): 
>>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204608.98450: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204608.98485: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204608.98502: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204608.98505: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204608.98562: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204608.98572: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204608.98625: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204609.00417: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204609.00421: stderr chunk (state=3): >>><<< 51385 1727204609.00423: stdout chunk (state=3): >>><<< 51385 1727204609.00444: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204609.00448: _low_level_execute_command(): starting 51385 1727204609.00452: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204608.8983984-53194-139962201488520/AnsiballZ_ping.py && sleep 0' 51385 1727204609.01048: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204609.01058: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204609.01073: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204609.01084: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204609.01123: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204609.01130: stderr chunk (state=3): >>>debug2: match not found <<< 51385 1727204609.01141: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204609.01155: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass <<< 51385 1727204609.01168: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 51385 1727204609.01174: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 51385 1727204609.01184: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204609.01191: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204609.01203: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204609.01211: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204609.01217: stderr chunk (state=3): >>>debug2: match found <<< 51385 1727204609.01227: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204609.01304: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204609.01323: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204609.01335: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204609.01422: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204609.14305: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 51385 1727204609.15204: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
<<< 51385 1727204609.15263: stderr chunk (state=3): >>><<< 51385 1727204609.15268: stdout chunk (state=3): >>><<< 51385 1727204609.15288: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
51385 1727204609.15309: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204608.8983984-53194-139962201488520/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 51385 1727204609.15317: _low_level_execute_command(): starting 51385 1727204609.15322: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204608.8983984-53194-139962201488520/ > /dev/null 2>&1 && sleep 0' 51385 1727204609.15795: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204609.15799: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204609.15854: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 51385 1727204609.15857: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204609.15859: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204609.15862: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204609.15923: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204609.15928: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204609.15934: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204609.15989: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204609.17731: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204609.17787: stderr chunk (state=3): >>><<< 51385 1727204609.17790: stdout chunk (state=3): >>><<< 51385 1727204609.17804: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 
2 debug2: Received exit status from master 0 51385 1727204609.17812: handler run complete 51385 1727204609.17824: attempt loop complete, returning result 51385 1727204609.17827: _execute() done 51385 1727204609.17829: dumping result to json 51385 1727204609.17831: done dumping result, returning 51385 1727204609.17839: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcd87-79f5-6b1f-5706-00000000007a] 51385 1727204609.17844: sending task result for task 0affcd87-79f5-6b1f-5706-00000000007a 51385 1727204609.17934: done sending task result for task 0affcd87-79f5-6b1f-5706-00000000007a 51385 1727204609.17937: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "ping": "pong" } 51385 1727204609.18035: no more pending results, returning what we have 51385 1727204609.18040: results queue empty 51385 1727204609.18041: checking for any_errors_fatal 51385 1727204609.18049: done checking for any_errors_fatal 51385 1727204609.18049: checking for max_fail_percentage 51385 1727204609.18051: done checking for max_fail_percentage 51385 1727204609.18052: checking to see if all hosts have failed and the running result is not ok 51385 1727204609.18053: done checking to see if all hosts have failed 51385 1727204609.18053: getting the remaining hosts for this loop 51385 1727204609.18055: done getting the remaining hosts for this loop 51385 1727204609.18061: getting the next task for host managed-node1 51385 1727204609.18074: done getting next task for host managed-node1 51385 1727204609.18077: ^ task is: TASK: meta (role_complete) 51385 1727204609.18080: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 51385 1727204609.18090: getting variables 51385 1727204609.18092: in VariableManager get_vars() 51385 1727204609.18130: Calling all_inventory to load vars for managed-node1 51385 1727204609.18132: Calling groups_inventory to load vars for managed-node1 51385 1727204609.18134: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204609.18143: Calling all_plugins_play to load vars for managed-node1 51385 1727204609.18145: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204609.18148: Calling groups_plugins_play to load vars for managed-node1 51385 1727204609.18984: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204609.19928: done with get_vars() 51385 1727204609.19946: done getting variables 51385 1727204609.20009: done queuing things up, now waiting for results queue to drain 51385 1727204609.20010: results queue empty 51385 1727204609.20011: checking for any_errors_fatal 51385 1727204609.20013: done checking for any_errors_fatal 51385 1727204609.20013: checking for max_fail_percentage 51385 1727204609.20015: done checking for max_fail_percentage 51385 1727204609.20015: checking to see if all hosts have failed and the running result is not ok 51385 1727204609.20016: done checking to see if all hosts have failed 51385 1727204609.20017: getting the remaining hosts for this loop 51385 1727204609.20017: done getting the remaining hosts for this loop 51385 1727204609.20019: getting the next task for host managed-node1 51385 1727204609.20022: done getting next task for host 
managed-node1 51385 1727204609.20024: ^ task is: TASK: Include the task 'manage_test_interface.yml' 51385 1727204609.20025: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 51385 1727204609.20027: getting variables 51385 1727204609.20028: in VariableManager get_vars() 51385 1727204609.20040: Calling all_inventory to load vars for managed-node1 51385 1727204609.20041: Calling groups_inventory to load vars for managed-node1 51385 1727204609.20042: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204609.20046: Calling all_plugins_play to load vars for managed-node1 51385 1727204609.20048: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204609.20049: Calling groups_plugins_play to load vars for managed-node1 51385 1727204609.20815: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204609.21765: done with get_vars() 51385 1727204609.21780: done getting variables TASK [Include the task 'manage_test_interface.yml'] **************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_vlan_mtu.yml:73 Tuesday 24 September 2024 15:03:29 -0400 (0:00:00.374) 0:00:27.622 ***** 51385 1727204609.21835: entering _queue_task() for managed-node1/include_tasks 51385 1727204609.22220: worker is 1 (out of 1 available) 51385 1727204609.22234: exiting _queue_task() for managed-node1/include_tasks 51385 1727204609.22246: done queuing things up, now waiting for results queue to drain 51385 1727204609.22247: waiting for pending results... 
51385 1727204609.22544: running TaskExecutor() for managed-node1/TASK: Include the task 'manage_test_interface.yml' 51385 1727204609.22629: in run() - task 0affcd87-79f5-6b1f-5706-0000000000aa 51385 1727204609.22640: variable 'ansible_search_path' from source: unknown 51385 1727204609.22679: calling self._execute() 51385 1727204609.22777: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204609.22783: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204609.22793: variable 'omit' from source: magic vars 51385 1727204609.23191: variable 'ansible_distribution_major_version' from source: facts 51385 1727204609.23203: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204609.23209: _execute() done 51385 1727204609.23212: dumping result to json 51385 1727204609.23216: done dumping result, returning 51385 1727204609.23222: done running TaskExecutor() for managed-node1/TASK: Include the task 'manage_test_interface.yml' [0affcd87-79f5-6b1f-5706-0000000000aa] 51385 1727204609.23229: sending task result for task 0affcd87-79f5-6b1f-5706-0000000000aa 51385 1727204609.23329: done sending task result for task 0affcd87-79f5-6b1f-5706-0000000000aa 51385 1727204609.23332: WORKER PROCESS EXITING 51385 1727204609.23370: no more pending results, returning what we have 51385 1727204609.23377: in VariableManager get_vars() 51385 1727204609.23427: Calling all_inventory to load vars for managed-node1 51385 1727204609.23431: Calling groups_inventory to load vars for managed-node1 51385 1727204609.23434: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204609.23453: Calling all_plugins_play to load vars for managed-node1 51385 1727204609.23456: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204609.23460: Calling groups_plugins_play to load vars for managed-node1 51385 1727204609.24346: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204609.25441: done with get_vars() 51385 1727204609.25460: variable 'ansible_search_path' from source: unknown 51385 1727204609.25475: we have included files to process 51385 1727204609.25476: generating all_blocks data 51385 1727204609.25478: done generating all_blocks data 51385 1727204609.25484: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 51385 1727204609.25485: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 51385 1727204609.25488: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 51385 1727204609.25829: in VariableManager get_vars() 51385 1727204609.25849: done with get_vars() 51385 1727204609.26467: done processing included file 51385 1727204609.26469: iterating over new_blocks loaded from include file 51385 1727204609.26471: in VariableManager get_vars() 51385 1727204609.26488: done with get_vars() 51385 1727204609.26490: filtering new block on tags 51385 1727204609.26522: done filtering new block on tags 51385 1727204609.26525: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml for managed-node1 51385 1727204609.26530: extending task lists for all hosts with included blocks 51385 1727204609.29430: done extending task lists 51385 1727204609.29432: done processing included files 51385 1727204609.29433: results queue empty 51385 1727204609.29434: checking for any_errors_fatal 51385 1727204609.29435: done checking for any_errors_fatal 51385 1727204609.29436: checking for max_fail_percentage 51385 1727204609.29437: done 
checking for max_fail_percentage 51385 1727204609.29438: checking to see if all hosts have failed and the running result is not ok 51385 1727204609.29439: done checking to see if all hosts have failed 51385 1727204609.29440: getting the remaining hosts for this loop 51385 1727204609.29441: done getting the remaining hosts for this loop 51385 1727204609.29444: getting the next task for host managed-node1 51385 1727204609.29447: done getting next task for host managed-node1 51385 1727204609.29449: ^ task is: TASK: Ensure state in ["present", "absent"] 51385 1727204609.29452: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51385 1727204609.29455: getting variables 51385 1727204609.29456: in VariableManager get_vars() 51385 1727204609.29473: Calling all_inventory to load vars for managed-node1 51385 1727204609.29475: Calling groups_inventory to load vars for managed-node1 51385 1727204609.29477: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204609.29484: Calling all_plugins_play to load vars for managed-node1 51385 1727204609.29486: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204609.29489: Calling groups_plugins_play to load vars for managed-node1 51385 1727204609.30718: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204609.32400: done with get_vars() 51385 1727204609.32424: done getting variables 51385 1727204609.32473: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Ensure state in ["present", "absent"]] *********************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:3 Tuesday 24 September 2024 15:03:29 -0400 (0:00:00.106) 0:00:27.728 ***** 51385 1727204609.32503: entering _queue_task() for managed-node1/fail 51385 1727204609.32833: worker is 1 (out of 1 available) 51385 1727204609.32846: exiting _queue_task() for managed-node1/fail 51385 1727204609.32859: done queuing things up, now waiting for results queue to drain 51385 1727204609.32861: waiting for pending results... 
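The include being processed here is an ordinary `include_tasks` step: the task is queued, the file is loaded, and its blocks are spliced into each host's task list ("extending task lists for all hosts with included blocks"). A minimal sketch of what the calling play likely looks like — the variable value below is hypothetical; the trace only shows that `state` arrives via include params and `type` via play vars:

```yaml
- name: Include the task 'manage_test_interface.yml'
  include_tasks: tasks/manage_test_interface.yml
  vars:
    state: present        # hypothetical value; the trace shows 'state' comes from include params
  when: ansible_distribution_major_version != '6'
```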
51385 1727204609.33166: running TaskExecutor() for managed-node1/TASK: Ensure state in ["present", "absent"] 51385 1727204609.33251: in run() - task 0affcd87-79f5-6b1f-5706-00000000093c 51385 1727204609.33267: variable 'ansible_search_path' from source: unknown 51385 1727204609.33271: variable 'ansible_search_path' from source: unknown 51385 1727204609.33302: calling self._execute() 51385 1727204609.33395: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204609.33398: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204609.33408: variable 'omit' from source: magic vars 51385 1727204609.33811: variable 'ansible_distribution_major_version' from source: facts 51385 1727204609.33824: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204609.33969: variable 'state' from source: include params 51385 1727204609.33977: Evaluated conditional (state not in ["present", "absent"]): False 51385 1727204609.33981: when evaluation is False, skipping this task 51385 1727204609.33983: _execute() done 51385 1727204609.33986: dumping result to json 51385 1727204609.33990: done dumping result, returning 51385 1727204609.33998: done running TaskExecutor() for managed-node1/TASK: Ensure state in ["present", "absent"] [0affcd87-79f5-6b1f-5706-00000000093c] 51385 1727204609.34001: sending task result for task 0affcd87-79f5-6b1f-5706-00000000093c 51385 1727204609.34094: done sending task result for task 0affcd87-79f5-6b1f-5706-00000000093c 51385 1727204609.34098: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "state not in [\"present\", \"absent\"]", "skip_reason": "Conditional result was False" } 51385 1727204609.34151: no more pending results, returning what we have 51385 1727204609.34156: results queue empty 51385 1727204609.34157: checking for any_errors_fatal 51385 1727204609.34159: done checking for any_errors_fatal 51385 1727204609.34160: 
checking for max_fail_percentage 51385 1727204609.34161: done checking for max_fail_percentage 51385 1727204609.34163: checking to see if all hosts have failed and the running result is not ok 51385 1727204609.34165: done checking to see if all hosts have failed 51385 1727204609.34166: getting the remaining hosts for this loop 51385 1727204609.34168: done getting the remaining hosts for this loop 51385 1727204609.34172: getting the next task for host managed-node1 51385 1727204609.34180: done getting next task for host managed-node1 51385 1727204609.34182: ^ task is: TASK: Ensure type in ["dummy", "tap", "veth"] 51385 1727204609.34187: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51385 1727204609.34191: getting variables 51385 1727204609.34194: in VariableManager get_vars() 51385 1727204609.34240: Calling all_inventory to load vars for managed-node1 51385 1727204609.34245: Calling groups_inventory to load vars for managed-node1 51385 1727204609.34247: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204609.34263: Calling all_plugins_play to load vars for managed-node1 51385 1727204609.34268: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204609.34272: Calling groups_plugins_play to load vars for managed-node1 51385 1727204609.36042: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204609.37673: done with get_vars() 51385 1727204609.37700: done getting variables 51385 1727204609.37761: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Ensure type in ["dummy", "tap", "veth"]] ********************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:8 Tuesday 24 September 2024 15:03:29 -0400 (0:00:00.052) 0:00:27.781 ***** 51385 1727204609.37796: entering _queue_task() for managed-node1/fail 51385 1727204609.38111: worker is 1 (out of 1 available) 51385 1727204609.38122: exiting _queue_task() for managed-node1/fail 51385 1727204609.38134: done queuing things up, now waiting for results queue to drain 51385 1727204609.38135: waiting for pending results... 
51385 1727204609.38422: running TaskExecutor() for managed-node1/TASK: Ensure type in ["dummy", "tap", "veth"] 51385 1727204609.38522: in run() - task 0affcd87-79f5-6b1f-5706-00000000093d 51385 1727204609.38535: variable 'ansible_search_path' from source: unknown 51385 1727204609.38539: variable 'ansible_search_path' from source: unknown 51385 1727204609.38577: calling self._execute() 51385 1727204609.38675: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204609.38681: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204609.38695: variable 'omit' from source: magic vars 51385 1727204609.39072: variable 'ansible_distribution_major_version' from source: facts 51385 1727204609.39086: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204609.39241: variable 'type' from source: play vars 51385 1727204609.39248: Evaluated conditional (type not in ["dummy", "tap", "veth"]): False 51385 1727204609.39251: when evaluation is False, skipping this task 51385 1727204609.39254: _execute() done 51385 1727204609.39257: dumping result to json 51385 1727204609.39262: done dumping result, returning 51385 1727204609.39266: done running TaskExecutor() for managed-node1/TASK: Ensure type in ["dummy", "tap", "veth"] [0affcd87-79f5-6b1f-5706-00000000093d] 51385 1727204609.39275: sending task result for task 0affcd87-79f5-6b1f-5706-00000000093d skipping: [managed-node1] => { "changed": false, "false_condition": "type not in [\"dummy\", \"tap\", \"veth\"]", "skip_reason": "Conditional result was False" } 51385 1727204609.39415: no more pending results, returning what we have 51385 1727204609.39420: results queue empty 51385 1727204609.39421: checking for any_errors_fatal 51385 1727204609.39429: done checking for any_errors_fatal 51385 1727204609.39429: checking for max_fail_percentage 51385 1727204609.39431: done checking for max_fail_percentage 51385 1727204609.39433: checking to see if all 
hosts have failed and the running result is not ok 51385 1727204609.39433: done checking to see if all hosts have failed 51385 1727204609.39434: getting the remaining hosts for this loop 51385 1727204609.39436: done getting the remaining hosts for this loop 51385 1727204609.39440: getting the next task for host managed-node1 51385 1727204609.39447: done getting next task for host managed-node1 51385 1727204609.39450: ^ task is: TASK: Include the task 'show_interfaces.yml' 51385 1727204609.39453: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51385 1727204609.39458: getting variables 51385 1727204609.39459: in VariableManager get_vars() 51385 1727204609.39504: Calling all_inventory to load vars for managed-node1 51385 1727204609.39507: Calling groups_inventory to load vars for managed-node1 51385 1727204609.39510: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204609.39516: done sending task result for task 0affcd87-79f5-6b1f-5706-00000000093d 51385 1727204609.39531: Calling all_plugins_play to load vars for managed-node1 51385 1727204609.39534: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204609.39538: Calling groups_plugins_play to load vars for managed-node1 51385 1727204609.40058: WORKER PROCESS EXITING 51385 1727204609.41174: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204609.42998: done with get_vars() 51385 1727204609.43021: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:13 Tuesday 24 September 2024 15:03:29 -0400 (0:00:00.053) 0:00:27.835 ***** 51385 1727204609.43122: entering _queue_task() for managed-node1/include_tasks 51385 1727204609.43446: worker is 1 (out of 1 available) 51385 1727204609.43460: exiting _queue_task() for managed-node1/include_tasks 51385 1727204609.43473: done queuing things up, now waiting for results queue to drain 51385 1727204609.43475: waiting for pending results... 
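The two guard tasks skipped above (task paths `manage_test_interface.yml:3` and `:8`) are `fail` tasks whose `when` conditions evaluated to False, so they were skipped rather than run. A sketch of what they likely look like — the conditionals are taken verbatim from the `false_condition` fields in the skip results; the failure messages are placeholders:

```yaml
- name: Ensure state in ["present", "absent"]
  fail:
    msg: "state must be 'present' or 'absent'"   # hypothetical message text
  when: state not in ["present", "absent"]

- name: Ensure type in ["dummy", "tap", "veth"]
  fail:
    msg: "type must be 'dummy', 'tap' or 'veth'" # hypothetical message text
  when: type not in ["dummy", "tap", "veth"]
```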
51385 1727204609.43768: running TaskExecutor() for managed-node1/TASK: Include the task 'show_interfaces.yml' 51385 1727204609.43854: in run() - task 0affcd87-79f5-6b1f-5706-00000000093e 51385 1727204609.43868: variable 'ansible_search_path' from source: unknown 51385 1727204609.43873: variable 'ansible_search_path' from source: unknown 51385 1727204609.43905: calling self._execute() 51385 1727204609.44002: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204609.44006: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204609.44015: variable 'omit' from source: magic vars 51385 1727204609.44395: variable 'ansible_distribution_major_version' from source: facts 51385 1727204609.44407: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204609.44414: _execute() done 51385 1727204609.44417: dumping result to json 51385 1727204609.44420: done dumping result, returning 51385 1727204609.44426: done running TaskExecutor() for managed-node1/TASK: Include the task 'show_interfaces.yml' [0affcd87-79f5-6b1f-5706-00000000093e] 51385 1727204609.44433: sending task result for task 0affcd87-79f5-6b1f-5706-00000000093e 51385 1727204609.44527: done sending task result for task 0affcd87-79f5-6b1f-5706-00000000093e 51385 1727204609.44530: WORKER PROCESS EXITING 51385 1727204609.44591: no more pending results, returning what we have 51385 1727204609.44597: in VariableManager get_vars() 51385 1727204609.44647: Calling all_inventory to load vars for managed-node1 51385 1727204609.44650: Calling groups_inventory to load vars for managed-node1 51385 1727204609.44653: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204609.44672: Calling all_plugins_play to load vars for managed-node1 51385 1727204609.44676: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204609.44680: Calling groups_plugins_play to load vars for managed-node1 51385 1727204609.46259: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204609.47889: done with get_vars() 51385 1727204609.47916: variable 'ansible_search_path' from source: unknown 51385 1727204609.47918: variable 'ansible_search_path' from source: unknown 51385 1727204609.47957: we have included files to process 51385 1727204609.47959: generating all_blocks data 51385 1727204609.47960: done generating all_blocks data 51385 1727204609.47968: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 51385 1727204609.47969: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 51385 1727204609.47971: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 51385 1727204609.48092: in VariableManager get_vars() 51385 1727204609.48121: done with get_vars() 51385 1727204609.48239: done processing included file 51385 1727204609.48241: iterating over new_blocks loaded from include file 51385 1727204609.48242: in VariableManager get_vars() 51385 1727204609.48261: done with get_vars() 51385 1727204609.48262: filtering new block on tags 51385 1727204609.48284: done filtering new block on tags 51385 1727204609.48287: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed-node1 51385 1727204609.48293: extending task lists for all hosts with included blocks 51385 1727204609.48703: done extending task lists 51385 1727204609.48704: done processing included files 51385 1727204609.48705: results queue empty 51385 1727204609.48706: checking for any_errors_fatal 51385 1727204609.48710: done checking for any_errors_fatal 51385 1727204609.48711: checking for 
max_fail_percentage 51385 1727204609.48712: done checking for max_fail_percentage 51385 1727204609.48713: checking to see if all hosts have failed and the running result is not ok 51385 1727204609.48713: done checking to see if all hosts have failed 51385 1727204609.48714: getting the remaining hosts for this loop 51385 1727204609.48715: done getting the remaining hosts for this loop 51385 1727204609.48717: getting the next task for host managed-node1 51385 1727204609.48722: done getting next task for host managed-node1 51385 1727204609.48724: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 51385 1727204609.48727: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51385 1727204609.48730: getting variables 51385 1727204609.48731: in VariableManager get_vars() 51385 1727204609.48744: Calling all_inventory to load vars for managed-node1 51385 1727204609.48746: Calling groups_inventory to load vars for managed-node1 51385 1727204609.48748: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204609.48754: Calling all_plugins_play to load vars for managed-node1 51385 1727204609.48756: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204609.48759: Calling groups_plugins_play to load vars for managed-node1 51385 1727204609.50072: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204609.51702: done with get_vars() 51385 1727204609.51732: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Tuesday 24 September 2024 15:03:29 -0400 (0:00:00.086) 0:00:27.922 ***** 51385 1727204609.51820: entering _queue_task() for managed-node1/include_tasks 51385 1727204609.52158: worker is 1 (out of 1 available) 51385 1727204609.52174: exiting _queue_task() for managed-node1/include_tasks 51385 1727204609.52186: done queuing things up, now waiting for results queue to drain 51385 1727204609.52188: waiting for pending results... 
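Note the nesting the trace is walking through: `manage_test_interface.yml` includes `show_interfaces.yml`, which in turn includes `get_current_interfaces.yml` — visible in the increasingly nested "tasks child state?" dumps above. A hedged sketch of the intermediate file; the second task's name and variable are hypothetical:

```yaml
# tasks/show_interfaces.yml — sketch of the nesting seen in the trace
- name: Include the task 'get_current_interfaces.yml'
  include_tasks: get_current_interfaces.yml

- name: Show current interfaces        # hypothetical follow-up step
  debug:
    var: current_interfaces            # hypothetical variable name
```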
51385 1727204609.52479: running TaskExecutor() for managed-node1/TASK: Include the task 'get_current_interfaces.yml' 51385 1727204609.52578: in run() - task 0affcd87-79f5-6b1f-5706-000000000aa0 51385 1727204609.52589: variable 'ansible_search_path' from source: unknown 51385 1727204609.52594: variable 'ansible_search_path' from source: unknown 51385 1727204609.52632: calling self._execute() 51385 1727204609.52729: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204609.52738: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204609.52750: variable 'omit' from source: magic vars 51385 1727204609.53134: variable 'ansible_distribution_major_version' from source: facts 51385 1727204609.53146: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204609.53152: _execute() done 51385 1727204609.53155: dumping result to json 51385 1727204609.53158: done dumping result, returning 51385 1727204609.53167: done running TaskExecutor() for managed-node1/TASK: Include the task 'get_current_interfaces.yml' [0affcd87-79f5-6b1f-5706-000000000aa0] 51385 1727204609.53177: sending task result for task 0affcd87-79f5-6b1f-5706-000000000aa0 51385 1727204609.53305: no more pending results, returning what we have 51385 1727204609.53312: in VariableManager get_vars() 51385 1727204609.53362: Calling all_inventory to load vars for managed-node1 51385 1727204609.53367: Calling groups_inventory to load vars for managed-node1 51385 1727204609.53369: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204609.53386: Calling all_plugins_play to load vars for managed-node1 51385 1727204609.53389: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204609.53392: Calling groups_plugins_play to load vars for managed-node1 51385 1727204609.54483: done sending task result for task 0affcd87-79f5-6b1f-5706-000000000aa0 51385 1727204609.54487: WORKER PROCESS EXITING 51385 
1727204609.55087: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204609.56678: done with get_vars() 51385 1727204609.56705: variable 'ansible_search_path' from source: unknown 51385 1727204609.56707: variable 'ansible_search_path' from source: unknown 51385 1727204609.56771: we have included files to process 51385 1727204609.56773: generating all_blocks data 51385 1727204609.56774: done generating all_blocks data 51385 1727204609.56776: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 51385 1727204609.56777: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 51385 1727204609.56779: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 51385 1727204609.57056: done processing included file 51385 1727204609.57058: iterating over new_blocks loaded from include file 51385 1727204609.57060: in VariableManager get_vars() 51385 1727204609.57084: done with get_vars() 51385 1727204609.57086: filtering new block on tags 51385 1727204609.57105: done filtering new block on tags 51385 1727204609.57108: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed-node1 51385 1727204609.57114: extending task lists for all hosts with included blocks 51385 1727204609.57276: done extending task lists 51385 1727204609.57277: done processing included files 51385 1727204609.57278: results queue empty 51385 1727204609.57279: checking for any_errors_fatal 51385 1727204609.57282: done checking for any_errors_fatal 51385 1727204609.57283: checking for max_fail_percentage 51385 1727204609.57284: done 
checking for max_fail_percentage 51385 1727204609.57285: checking to see if all hosts have failed and the running result is not ok 51385 1727204609.57286: done checking to see if all hosts have failed 51385 1727204609.57287: getting the remaining hosts for this loop 51385 1727204609.57288: done getting the remaining hosts for this loop 51385 1727204609.57291: getting the next task for host managed-node1 51385 1727204609.57295: done getting next task for host managed-node1 51385 1727204609.57298: ^ task is: TASK: Gather current interface info 51385 1727204609.57301: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51385 1727204609.57304: getting variables 51385 1727204609.57304: in VariableManager get_vars() 51385 1727204609.57319: Calling all_inventory to load vars for managed-node1 51385 1727204609.57321: Calling groups_inventory to load vars for managed-node1 51385 1727204609.57323: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204609.57329: Calling all_plugins_play to load vars for managed-node1 51385 1727204609.57332: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204609.57335: Calling groups_plugins_play to load vars for managed-node1 51385 1727204609.58596: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204609.60220: done with get_vars() 51385 1727204609.60245: done getting variables 51385 1727204609.60292: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Tuesday 24 September 2024 15:03:29 -0400 (0:00:00.085) 0:00:28.007 ***** 51385 1727204609.60324: entering _queue_task() for managed-node1/command 51385 1727204609.60644: worker is 1 (out of 1 available) 51385 1727204609.60657: exiting _queue_task() for managed-node1/command 51385 1727204609.60671: done queuing things up, now waiting for results queue to drain 51385 1727204609.60672: waiting for pending results... 
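The "Gather current interface info" task is dispatched through the `command` action plugin (loaded from `plugins/action/command.py` above), so the task file plausibly runs a shell utility and registers its output. A sketch under that assumption — the trace confirms only the task name and the action plugin, not the exact command or register name:

```yaml
# tasks/get_current_interfaces.yml — hedged sketch
- name: Gather current interface info
  command: ls -1 /sys/class/net        # hypothetical command
  register: _current_interfaces_info   # hypothetical register name
```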
51385 1727204609.60952: running TaskExecutor() for managed-node1/TASK: Gather current interface info 51385 1727204609.61086: in run() - task 0affcd87-79f5-6b1f-5706-000000000ad7 51385 1727204609.61103: variable 'ansible_search_path' from source: unknown 51385 1727204609.61114: variable 'ansible_search_path' from source: unknown 51385 1727204609.61152: calling self._execute() 51385 1727204609.61250: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204609.61261: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204609.61303: variable 'omit' from source: magic vars 51385 1727204609.61685: variable 'ansible_distribution_major_version' from source: facts 51385 1727204609.61703: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204609.61715: variable 'omit' from source: magic vars 51385 1727204609.61783: variable 'omit' from source: magic vars 51385 1727204609.61823: variable 'omit' from source: magic vars 51385 1727204609.61875: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 51385 1727204609.61917: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 51385 1727204609.61945: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 51385 1727204609.61966: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 51385 1727204609.61986: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 51385 1727204609.62016: variable 'inventory_hostname' from source: host vars for 'managed-node1' 51385 1727204609.62022: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204609.62027: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 
1727204609.62121: Set connection var ansible_pipelining to False 51385 1727204609.62129: Set connection var ansible_shell_type to sh 51385 1727204609.62140: Set connection var ansible_module_compression to ZIP_DEFLATED 51385 1727204609.62150: Set connection var ansible_timeout to 10 51385 1727204609.62155: Set connection var ansible_connection to ssh 51385 1727204609.62163: Set connection var ansible_shell_executable to /bin/sh 51385 1727204609.62195: variable 'ansible_shell_executable' from source: unknown 51385 1727204609.62206: variable 'ansible_connection' from source: unknown 51385 1727204609.62214: variable 'ansible_module_compression' from source: unknown 51385 1727204609.62221: variable 'ansible_shell_type' from source: unknown 51385 1727204609.62228: variable 'ansible_shell_executable' from source: unknown 51385 1727204609.62234: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204609.62242: variable 'ansible_pipelining' from source: unknown 51385 1727204609.62249: variable 'ansible_timeout' from source: unknown 51385 1727204609.62257: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204609.62406: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 51385 1727204609.62429: variable 'omit' from source: magic vars 51385 1727204609.62442: starting attempt loop 51385 1727204609.62449: running the handler 51385 1727204609.62473: _low_level_execute_command(): starting 51385 1727204609.62487: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 51385 1727204609.63275: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204609.63295: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 51385 1727204609.63311: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204609.63332: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204609.63379: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204609.63392: stderr chunk (state=3): >>>debug2: match not found <<< 51385 1727204609.63410: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204609.63430: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 51385 1727204609.63442: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 51385 1727204609.63453: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 51385 1727204609.63467: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204609.63483: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204609.63500: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204609.63515: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204609.63529: stderr chunk (state=3): >>>debug2: match found <<< 51385 1727204609.63545: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204609.63621: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204609.63651: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204609.63674: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204609.63769: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session 
id: 2 <<< 51385 1727204609.65341: stdout chunk (state=3): >>>/root <<< 51385 1727204609.65476: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204609.65500: stderr chunk (state=3): >>><<< 51385 1727204609.65503: stdout chunk (state=3): >>><<< 51385 1727204609.65522: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204609.65533: _low_level_execute_command(): starting 51385 1727204609.65539: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204609.65523-53219-43346422776420 `" && echo ansible-tmp-1727204609.65523-53219-43346422776420="` echo /root/.ansible/tmp/ansible-tmp-1727204609.65523-53219-43346422776420 `" ) && sleep 0' 51385 1727204609.65979: stderr chunk (state=2): >>>OpenSSH_8.7p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204609.65985: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204609.66016: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204609.66024: stderr chunk (state=3): >>>debug2: match not found <<< 51385 1727204609.66034: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204609.66044: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 51385 1727204609.66052: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 51385 1727204609.66057: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204609.66071: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204609.66076: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found <<< 51385 1727204609.66082: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204609.66132: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204609.66147: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204609.66150: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204609.66226: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204609.68070: stdout chunk (state=3): 
>>>ansible-tmp-1727204609.65523-53219-43346422776420=/root/.ansible/tmp/ansible-tmp-1727204609.65523-53219-43346422776420 <<< 51385 1727204609.68180: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204609.68223: stderr chunk (state=3): >>><<< 51385 1727204609.68226: stdout chunk (state=3): >>><<< 51385 1727204609.68239: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204609.65523-53219-43346422776420=/root/.ansible/tmp/ansible-tmp-1727204609.65523-53219-43346422776420 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204609.68266: variable 'ansible_module_compression' from source: unknown 51385 1727204609.68309: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-51385tpxlmlox/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 51385 1727204609.68337: variable 'ansible_facts' from source: unknown 51385 1727204609.68401: 
transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204609.65523-53219-43346422776420/AnsiballZ_command.py 51385 1727204609.68502: Sending initial data 51385 1727204609.68505: Sent initial data (153 bytes) 51385 1727204609.69850: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204609.69868: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204609.69894: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204609.69913: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204609.69955: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204609.69973: stderr chunk (state=3): >>>debug2: match not found <<< 51385 1727204609.70022: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204609.70043: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 51385 1727204609.70056: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 51385 1727204609.70075: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 51385 1727204609.70090: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204609.70113: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204609.70131: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204609.70143: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204609.70155: stderr chunk (state=3): >>>debug2: match found <<< 51385 1727204609.70173: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204609.70255: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204609.70277: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204609.70292: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204609.70453: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204609.72097: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 51385 1727204609.72142: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 51385 1727204609.72191: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-51385tpxlmlox/tmpfrdzi1pe /root/.ansible/tmp/ansible-tmp-1727204609.65523-53219-43346422776420/AnsiballZ_command.py <<< 51385 1727204609.72237: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 51385 1727204609.73099: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204609.73304: stderr chunk (state=3): >>><<< 51385 1727204609.73307: stdout chunk (state=3): >>><<< 51385 1727204609.73309: done transferring module to remote 51385 1727204609.73311: _low_level_execute_command(): starting 51385 1727204609.73314: _low_level_execute_command(): executing: /bin/sh -c 
'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204609.65523-53219-43346422776420/ /root/.ansible/tmp/ansible-tmp-1727204609.65523-53219-43346422776420/AnsiballZ_command.py && sleep 0' 51385 1727204609.73974: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204609.73988: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204609.74004: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204609.74032: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204609.74078: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204609.74090: stderr chunk (state=3): >>>debug2: match not found <<< 51385 1727204609.74104: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204609.74128: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 51385 1727204609.74145: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 51385 1727204609.74158: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 51385 1727204609.74185: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204609.74188: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204609.74252: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204609.74258: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 51385 
1727204609.74314: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204609.76004: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204609.76046: stderr chunk (state=3): >>><<< 51385 1727204609.76050: stdout chunk (state=3): >>><<< 51385 1727204609.76066: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204609.76070: _low_level_execute_command(): starting 51385 1727204609.76072: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204609.65523-53219-43346422776420/AnsiballZ_command.py && sleep 0' 51385 1727204609.76507: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204609.76513: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204609.76550: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204609.76555: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204609.76567: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204609.76578: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204609.76583: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204609.76648: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204609.76654: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204609.76658: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204609.76754: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204609.90029: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo\nlsr101\npeerlsr101\nrpltstbr", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 15:03:29.896699", "end": "2024-09-24 15:03:29.899635", "delta": "0:00:00.002936", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": 
null, "creates": null, "removes": null, "stdin": null}}} <<< 51385 1727204609.91204: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. <<< 51385 1727204609.91248: stderr chunk (state=3): >>><<< 51385 1727204609.91252: stdout chunk (state=3): >>><<< 51385 1727204609.91401: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo\nlsr101\npeerlsr101\nrpltstbr", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 15:03:29.896699", "end": "2024-09-24 15:03:29.899635", "delta": "0:00:00.002936", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 
debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 51385 1727204609.91406: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204609.65523-53219-43346422776420/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 51385 1727204609.91415: _low_level_execute_command(): starting 51385 1727204609.91417: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204609.65523-53219-43346422776420/ > /dev/null 2>&1 && sleep 0' 51385 1727204609.91993: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204609.92007: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204609.92021: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204609.92041: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204609.92086: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204609.92098: stderr chunk (state=3): >>>debug2: match not found <<< 51385 1727204609.92113: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204609.92130: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass <<< 51385 1727204609.92142: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 51385 1727204609.92152: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 51385 1727204609.92163: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204609.92179: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204609.92194: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204609.92204: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204609.92339: stderr chunk (state=3): >>>debug2: match found <<< 51385 1727204609.92359: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204609.92463: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204609.92498: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204609.92501: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204609.92588: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204609.94380: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204609.94433: stderr chunk (state=3): >>><<< 51385 1727204609.94436: stdout chunk (state=3): >>><<< 51385 1727204609.94770: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204609.94773: handler run complete 51385 1727204609.94776: Evaluated conditional (False): False 51385 1727204609.94778: attempt loop complete, returning result 51385 1727204609.94780: _execute() done 51385 1727204609.94782: dumping result to json 51385 1727204609.94784: done dumping result, returning 51385 1727204609.94786: done running TaskExecutor() for managed-node1/TASK: Gather current interface info [0affcd87-79f5-6b1f-5706-000000000ad7] 51385 1727204609.94788: sending task result for task 0affcd87-79f5-6b1f-5706-000000000ad7 51385 1727204609.94862: done sending task result for task 0affcd87-79f5-6b1f-5706-000000000ad7 51385 1727204609.94867: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.002936", "end": "2024-09-24 15:03:29.899635", "rc": 0, "start": "2024-09-24 15:03:29.896699" } STDOUT: bonding_masters eth0 lo lsr101 peerlsr101 rpltstbr 51385 1727204609.94940: no more pending results, returning what we have 51385 1727204609.94944: results queue empty 51385 1727204609.94945: checking for any_errors_fatal 51385 1727204609.94946: done checking for any_errors_fatal 51385 1727204609.94947: checking for max_fail_percentage 51385 
1727204609.94949: done checking for max_fail_percentage 51385 1727204609.94950: checking to see if all hosts have failed and the running result is not ok 51385 1727204609.94951: done checking to see if all hosts have failed 51385 1727204609.94952: getting the remaining hosts for this loop 51385 1727204609.94953: done getting the remaining hosts for this loop 51385 1727204609.94957: getting the next task for host managed-node1 51385 1727204609.94965: done getting next task for host managed-node1 51385 1727204609.94967: ^ task is: TASK: Set current_interfaces 51385 1727204609.94973: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51385 1727204609.94977: getting variables 51385 1727204609.94978: in VariableManager get_vars() 51385 1727204609.95017: Calling all_inventory to load vars for managed-node1 51385 1727204609.95020: Calling groups_inventory to load vars for managed-node1 51385 1727204609.95022: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204609.95034: Calling all_plugins_play to load vars for managed-node1 51385 1727204609.95037: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204609.95041: Calling groups_plugins_play to load vars for managed-node1 51385 1727204609.98359: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204610.13952: done with get_vars() 51385 1727204610.13989: done getting variables 51385 1727204610.14042: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Tuesday 24 September 2024 15:03:30 -0400 (0:00:00.537) 0:00:28.544 ***** 51385 1727204610.14083: entering _queue_task() for managed-node1/set_fact 51385 1727204610.14826: worker is 1 (out of 1 available) 51385 1727204610.14839: exiting _queue_task() for managed-node1/set_fact 51385 1727204610.14852: done queuing things up, now waiting for results queue to drain 51385 1727204610.14853: waiting for pending results... 
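
The "Gather current interface info" task above returns its module result as a single JSON object on stdout, and the follow-up `set_fact` task turns that payload's `stdout` field into the `current_interfaces` list. A minimal sketch of that transformation — the JSON is copied from the log above, but the parsing helper is illustrative only, not Ansible's actual controller code:

```python
import json

# Module result as captured in the log above (truncated to the relevant keys).
raw = '{"changed": true, "stdout": "bonding_masters\\neth0\\nlo\\nlsr101\\npeerlsr101\\nrpltstbr", "rc": 0}'

result = json.loads(raw)

# get_current_interfaces.yml's set_fact splits the newline-separated
# `ls -1` output of /sys/class/net into a list of interface names.
current_interfaces = result["stdout"].split("\n")
print(current_interfaces)
```

This matches the `ansible_facts` the log reports for the "Set current_interfaces" task: `["bonding_masters", "eth0", "lo", "lsr101", "peerlsr101", "rpltstbr"]`.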
51385 1727204610.15577: running TaskExecutor() for managed-node1/TASK: Set current_interfaces 51385 1727204610.15749: in run() - task 0affcd87-79f5-6b1f-5706-000000000ad8 51385 1727204610.15781: variable 'ansible_search_path' from source: unknown 51385 1727204610.15789: variable 'ansible_search_path' from source: unknown 51385 1727204610.15841: calling self._execute() 51385 1727204610.15958: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204610.15974: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204610.15994: variable 'omit' from source: magic vars 51385 1727204610.16498: variable 'ansible_distribution_major_version' from source: facts 51385 1727204610.16517: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204610.16533: variable 'omit' from source: magic vars 51385 1727204610.16595: variable 'omit' from source: magic vars 51385 1727204610.16719: variable '_current_interfaces' from source: set_fact 51385 1727204610.16801: variable 'omit' from source: magic vars 51385 1727204610.16868: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 51385 1727204610.16910: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 51385 1727204610.16984: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 51385 1727204610.17007: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 51385 1727204610.17356: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 51385 1727204610.17400: variable 'inventory_hostname' from source: host vars for 'managed-node1' 51385 1727204610.17410: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204610.17424: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204610.17549: Set connection var ansible_pipelining to False 51385 1727204610.17574: Set connection var ansible_shell_type to sh 51385 1727204610.17599: Set connection var ansible_module_compression to ZIP_DEFLATED 51385 1727204610.17612: Set connection var ansible_timeout to 10 51385 1727204610.17626: Set connection var ansible_connection to ssh 51385 1727204610.17638: Set connection var ansible_shell_executable to /bin/sh 51385 1727204610.17674: variable 'ansible_shell_executable' from source: unknown 51385 1727204610.17686: variable 'ansible_connection' from source: unknown 51385 1727204610.17693: variable 'ansible_module_compression' from source: unknown 51385 1727204610.17700: variable 'ansible_shell_type' from source: unknown 51385 1727204610.17706: variable 'ansible_shell_executable' from source: unknown 51385 1727204610.17713: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204610.17731: variable 'ansible_pipelining' from source: unknown 51385 1727204610.17798: variable 'ansible_timeout' from source: unknown 51385 1727204610.17899: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204610.18167: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 51385 1727204610.18238: variable 'omit' from source: magic vars 51385 1727204610.18340: starting attempt loop 51385 1727204610.18348: running the handler 51385 1727204610.18363: handler run complete 51385 1727204610.18382: attempt loop complete, returning result 51385 1727204610.18388: _execute() done 51385 1727204610.18397: dumping result to json 51385 1727204610.18405: done dumping result, returning 51385 
1727204610.18415: done running TaskExecutor() for managed-node1/TASK: Set current_interfaces [0affcd87-79f5-6b1f-5706-000000000ad8] 51385 1727204610.18426: sending task result for task 0affcd87-79f5-6b1f-5706-000000000ad8 ok: [managed-node1] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo", "lsr101", "peerlsr101", "rpltstbr" ] }, "changed": false } 51385 1727204610.18602: no more pending results, returning what we have 51385 1727204610.18605: results queue empty 51385 1727204610.18606: checking for any_errors_fatal 51385 1727204610.18616: done checking for any_errors_fatal 51385 1727204610.18617: checking for max_fail_percentage 51385 1727204610.18619: done checking for max_fail_percentage 51385 1727204610.18620: checking to see if all hosts have failed and the running result is not ok 51385 1727204610.18621: done checking to see if all hosts have failed 51385 1727204610.18622: getting the remaining hosts for this loop 51385 1727204610.18624: done getting the remaining hosts for this loop 51385 1727204610.18628: getting the next task for host managed-node1 51385 1727204610.18637: done getting next task for host managed-node1 51385 1727204610.18640: ^ task is: TASK: Show current_interfaces 51385 1727204610.18644: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 51385 1727204610.18648: getting variables 51385 1727204610.18650: in VariableManager get_vars() 51385 1727204610.18702: Calling all_inventory to load vars for managed-node1 51385 1727204610.18706: Calling groups_inventory to load vars for managed-node1 51385 1727204610.18708: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204610.18720: Calling all_plugins_play to load vars for managed-node1 51385 1727204610.18723: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204610.18726: Calling groups_plugins_play to load vars for managed-node1 51385 1727204610.19559: done sending task result for task 0affcd87-79f5-6b1f-5706-000000000ad8 51385 1727204610.19563: WORKER PROCESS EXITING 51385 1727204610.20601: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204610.22986: done with get_vars() 51385 1727204610.23017: done getting variables 51385 1727204610.23091: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Tuesday 24 September 2024 15:03:30 -0400 (0:00:00.090) 0:00:28.635 ***** 51385 1727204610.23128: entering _queue_task() for managed-node1/debug 51385 1727204610.23538: worker is 1 (out of 1 available) 51385 1727204610.23550: exiting _queue_task() for managed-node1/debug 51385 1727204610.23574: done queuing things up, now waiting for results queue to drain 51385 
1727204610.23576: waiting for pending results... 51385 1727204610.24455: running TaskExecutor() for managed-node1/TASK: Show current_interfaces 51385 1727204610.24882: in run() - task 0affcd87-79f5-6b1f-5706-000000000aa1 51385 1727204610.25021: variable 'ansible_search_path' from source: unknown 51385 1727204610.25029: variable 'ansible_search_path' from source: unknown 51385 1727204610.25101: calling self._execute() 51385 1727204610.25469: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204610.25630: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204610.25671: variable 'omit' from source: magic vars 51385 1727204610.26133: variable 'ansible_distribution_major_version' from source: facts 51385 1727204610.26145: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204610.26151: variable 'omit' from source: magic vars 51385 1727204610.26189: variable 'omit' from source: magic vars 51385 1727204610.26262: variable 'current_interfaces' from source: set_fact 51385 1727204610.26290: variable 'omit' from source: magic vars 51385 1727204610.26324: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 51385 1727204610.26352: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 51385 1727204610.26374: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 51385 1727204610.26388: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 51385 1727204610.26398: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 51385 1727204610.26422: variable 'inventory_hostname' from source: host vars for 'managed-node1' 51385 1727204610.26425: variable 'ansible_host' from source: host vars for 
'managed-node1' 51385 1727204610.26428: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204610.26506: Set connection var ansible_pipelining to False 51385 1727204610.26510: Set connection var ansible_shell_type to sh 51385 1727204610.26516: Set connection var ansible_module_compression to ZIP_DEFLATED 51385 1727204610.26523: Set connection var ansible_timeout to 10 51385 1727204610.26526: Set connection var ansible_connection to ssh 51385 1727204610.26531: Set connection var ansible_shell_executable to /bin/sh 51385 1727204610.26549: variable 'ansible_shell_executable' from source: unknown 51385 1727204610.26552: variable 'ansible_connection' from source: unknown 51385 1727204610.26557: variable 'ansible_module_compression' from source: unknown 51385 1727204610.26559: variable 'ansible_shell_type' from source: unknown 51385 1727204610.26561: variable 'ansible_shell_executable' from source: unknown 51385 1727204610.26565: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204610.26567: variable 'ansible_pipelining' from source: unknown 51385 1727204610.26571: variable 'ansible_timeout' from source: unknown 51385 1727204610.26576: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204610.26678: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 51385 1727204610.26690: variable 'omit' from source: magic vars 51385 1727204610.26693: starting attempt loop 51385 1727204610.26696: running the handler 51385 1727204610.26732: handler run complete 51385 1727204610.26742: attempt loop complete, returning result 51385 1727204610.26745: _execute() done 51385 1727204610.26747: dumping result to json 51385 1727204610.26750: done 
dumping result, returning 51385 1727204610.26757: done running TaskExecutor() for managed-node1/TASK: Show current_interfaces [0affcd87-79f5-6b1f-5706-000000000aa1] 51385 1727204610.26766: sending task result for task 0affcd87-79f5-6b1f-5706-000000000aa1 51385 1727204610.26849: done sending task result for task 0affcd87-79f5-6b1f-5706-000000000aa1 51385 1727204610.26851: WORKER PROCESS EXITING ok: [managed-node1] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo', 'lsr101', 'peerlsr101', 'rpltstbr'] 51385 1727204610.26899: no more pending results, returning what we have 51385 1727204610.26902: results queue empty 51385 1727204610.26903: checking for any_errors_fatal 51385 1727204610.26909: done checking for any_errors_fatal 51385 1727204610.26909: checking for max_fail_percentage 51385 1727204610.26911: done checking for max_fail_percentage 51385 1727204610.26912: checking to see if all hosts have failed and the running result is not ok 51385 1727204610.26913: done checking to see if all hosts have failed 51385 1727204610.26913: getting the remaining hosts for this loop 51385 1727204610.26915: done getting the remaining hosts for this loop 51385 1727204610.26919: getting the next task for host managed-node1 51385 1727204610.26927: done getting next task for host managed-node1 51385 1727204610.26929: ^ task is: TASK: Install iproute 51385 1727204610.26932: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51385 1727204610.26936: getting variables 51385 1727204610.26938: in VariableManager get_vars() 51385 1727204610.26983: Calling all_inventory to load vars for managed-node1 51385 1727204610.26986: Calling groups_inventory to load vars for managed-node1 51385 1727204610.26988: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204610.26998: Calling all_plugins_play to load vars for managed-node1 51385 1727204610.27001: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204610.27003: Calling groups_plugins_play to load vars for managed-node1 51385 1727204610.28442: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204610.30736: done with get_vars() 51385 1727204610.30757: done getting variables 51385 1727204610.30806: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install iproute] ********************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 Tuesday 24 September 2024 15:03:30 -0400 (0:00:00.077) 0:00:28.712 ***** 51385 1727204610.30830: entering _queue_task() for managed-node1/package 51385 1727204610.31067: worker is 1 (out of 1 available) 51385 1727204610.31083: exiting _queue_task() for managed-node1/package 51385 1727204610.31094: done queuing things up, now waiting for results queue to drain 51385 1727204610.31095: waiting for pending results... 
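The two task results traced above (`Set current_interfaces` and `Show current_interfaces`) come from a `set_fact` and a `debug` action respectively, per the logged `Loading ActionModule` lines and the task path `tasks/show_interfaces.yml:5`. A minimal sketch of what such tasks could look like, reconstructed from the logged output rather than the verbatim test source (the `_current_interfaces` expression is a hypothetical placeholder — the log shows only the resulting fact, not where it came from):

```yaml
# Hypothetical reconstruction from the logged results; the real tasks live in
# the fedora.linux_system_roles network tests and may differ in detail.
- name: Set current_interfaces
  ansible.builtin.set_fact:
    current_interfaces: "{{ _current_interfaces }}"  # placeholder source expression

- name: Show current_interfaces
  ansible.builtin.debug:
    msg: "current_interfaces: {{ current_interfaces }}"
```

With `current_interfaces` set as above, the `debug` task produces exactly the message seen in the log: `current_interfaces: ['bonding_masters', 'eth0', 'lo', 'lsr101', 'peerlsr101', 'rpltstbr']`.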
51385 1727204610.31286: running TaskExecutor() for managed-node1/TASK: Install iproute 51385 1727204610.31362: in run() - task 0affcd87-79f5-6b1f-5706-00000000093f 51385 1727204610.31376: variable 'ansible_search_path' from source: unknown 51385 1727204610.31379: variable 'ansible_search_path' from source: unknown 51385 1727204610.31411: calling self._execute() 51385 1727204610.31494: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204610.31498: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204610.31507: variable 'omit' from source: magic vars 51385 1727204610.31814: variable 'ansible_distribution_major_version' from source: facts 51385 1727204610.31825: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204610.31829: variable 'omit' from source: magic vars 51385 1727204610.31896: variable 'omit' from source: magic vars 51385 1727204610.32133: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 51385 1727204610.34969: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 51385 1727204610.35022: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 51385 1727204610.35051: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 51385 1727204610.35082: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 51385 1727204610.35101: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 51385 1727204610.35174: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 51385 1727204610.35194: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 51385 1727204610.35211: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 51385 1727204610.35237: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 51385 1727204610.35248: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 51385 1727204610.35330: variable '__network_is_ostree' from source: set_fact 51385 1727204610.35334: variable 'omit' from source: magic vars 51385 1727204610.35358: variable 'omit' from source: magic vars 51385 1727204610.35385: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 51385 1727204610.35406: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 51385 1727204610.35421: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 51385 1727204610.35433: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 51385 1727204610.35441: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 51385 1727204610.35468: variable 'inventory_hostname' from source: host vars for 'managed-node1' 51385 1727204610.35471: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204610.35476: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed-node1' 51385 1727204610.35542: Set connection var ansible_pipelining to False 51385 1727204610.35546: Set connection var ansible_shell_type to sh 51385 1727204610.35554: Set connection var ansible_module_compression to ZIP_DEFLATED 51385 1727204610.35566: Set connection var ansible_timeout to 10 51385 1727204610.35569: Set connection var ansible_connection to ssh 51385 1727204610.35573: Set connection var ansible_shell_executable to /bin/sh 51385 1727204610.35592: variable 'ansible_shell_executable' from source: unknown 51385 1727204610.35597: variable 'ansible_connection' from source: unknown 51385 1727204610.35599: variable 'ansible_module_compression' from source: unknown 51385 1727204610.35601: variable 'ansible_shell_type' from source: unknown 51385 1727204610.35603: variable 'ansible_shell_executable' from source: unknown 51385 1727204610.35605: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204610.35608: variable 'ansible_pipelining' from source: unknown 51385 1727204610.35611: variable 'ansible_timeout' from source: unknown 51385 1727204610.35616: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204610.35686: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 51385 1727204610.35695: variable 'omit' from source: magic vars 51385 1727204610.35698: starting attempt loop 51385 1727204610.35701: running the handler 51385 1727204610.35708: variable 'ansible_facts' from source: unknown 51385 1727204610.35710: variable 'ansible_facts' from source: unknown 51385 1727204610.35738: _low_level_execute_command(): starting 51385 1727204610.35744: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 51385 
1727204610.36213: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204610.36223: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204610.36333: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204610.36399: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204610.36417: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204610.36489: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204610.38066: stdout chunk (state=3): >>>/root <<< 51385 1727204610.38176: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204610.38228: stderr chunk (state=3): >>><<< 51385 1727204610.38231: stdout chunk (state=3): >>><<< 51385 1727204610.38267: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204610.38277: _low_level_execute_command(): starting 51385 1727204610.38281: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204610.3824427-53260-103055590822627 `" && echo ansible-tmp-1727204610.3824427-53260-103055590822627="` echo /root/.ansible/tmp/ansible-tmp-1727204610.3824427-53260-103055590822627 `" ) && sleep 0' 51385 1727204610.38713: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204610.38717: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204610.38746: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204610.38750: stderr chunk (state=3): >>>debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204610.38752: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204610.38806: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204610.38810: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204610.38873: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204610.40703: stdout chunk (state=3): >>>ansible-tmp-1727204610.3824427-53260-103055590822627=/root/.ansible/tmp/ansible-tmp-1727204610.3824427-53260-103055590822627 <<< 51385 1727204610.40817: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204610.40872: stderr chunk (state=3): >>><<< 51385 1727204610.40875: stdout chunk (state=3): >>><<< 51385 1727204610.40884: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204610.3824427-53260-103055590822627=/root/.ansible/tmp/ansible-tmp-1727204610.3824427-53260-103055590822627 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204610.40914: variable 'ansible_module_compression' from source: unknown 51385 1727204610.40965: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-51385tpxlmlox/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 51385 1727204610.40994: variable 'ansible_facts' from source: unknown 51385 1727204610.41067: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204610.3824427-53260-103055590822627/AnsiballZ_dnf.py 51385 1727204610.41172: Sending initial data 51385 1727204610.41176: Sent initial data (152 bytes) 51385 1727204610.41832: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204610.41836: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204610.41876: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204610.41879: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 51385 1727204610.41887: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204610.41929: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204610.41934: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204610.41998: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204610.43696: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 51385 1727204610.43747: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 51385 1727204610.43798: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-51385tpxlmlox/tmpisz1q35a /root/.ansible/tmp/ansible-tmp-1727204610.3824427-53260-103055590822627/AnsiballZ_dnf.py <<< 51385 1727204610.43852: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 51385 1727204610.44923: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204610.45028: stderr chunk (state=3): >>><<< 51385 
1727204610.45031: stdout chunk (state=3): >>><<< 51385 1727204610.45047: done transferring module to remote 51385 1727204610.45057: _low_level_execute_command(): starting 51385 1727204610.45065: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204610.3824427-53260-103055590822627/ /root/.ansible/tmp/ansible-tmp-1727204610.3824427-53260-103055590822627/AnsiballZ_dnf.py && sleep 0' 51385 1727204610.45507: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204610.45513: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204610.45543: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204610.45571: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204610.45623: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204610.45629: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204610.45693: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204610.47379: stderr chunk (state=3): >>>debug2: Received exit status from master 0 
<<< 51385 1727204610.47428: stderr chunk (state=3): >>><<< 51385 1727204610.47431: stdout chunk (state=3): >>><<< 51385 1727204610.47444: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204610.47448: _low_level_execute_command(): starting 51385 1727204610.47451: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204610.3824427-53260-103055590822627/AnsiballZ_dnf.py && sleep 0' 51385 1727204610.47889: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204610.47894: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204610.47928: stderr chunk (state=3): >>>debug2: checking match for 
'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204610.47940: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204610.47950: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204610.47998: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204610.48010: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204610.48079: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204611.39566: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 51385 1727204611.43906: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. <<< 51385 1727204611.43910: stdout chunk (state=3): >>><<< 51385 1727204611.43913: stderr chunk (state=3): >>><<< 51385 1727204611.43971: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing 
master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 51385 1727204611.44078: done with _execute_module (ansible.legacy.dnf, {'name': 'iproute', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204610.3824427-53260-103055590822627/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 51385 1727204611.44082: _low_level_execute_command(): starting 51385 1727204611.44084: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204610.3824427-53260-103055590822627/ > /dev/null 2>&1 && sleep 0' 51385 1727204611.44697: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204611.44751: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204611.44754: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204611.44794: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 51385 1727204611.44798: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204611.44801: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204611.44912: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204611.44970: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204611.45020: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204611.46830: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204611.46935: stderr chunk (state=3): >>><<< 51385 1727204611.46938: stdout chunk (state=3): >>><<< 51385 1727204611.47173: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204611.47176: handler run complete 51385 1727204611.47179: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 51385 1727204611.47362: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 51385 1727204611.47420: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 51385 1727204611.47455: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 51385 1727204611.47491: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 51385 1727204611.47647: variable '__install_status' from source: set_fact 51385 1727204611.47699: Evaluated conditional (__install_status is success): True 51385 1727204611.47743: attempt loop complete, returning result 51385 1727204611.47749: _execute() done 51385 1727204611.47755: dumping result to json 51385 1727204611.47767: done dumping result, returning 51385 1727204611.47779: done running TaskExecutor() for managed-node1/TASK: Install iproute [0affcd87-79f5-6b1f-5706-00000000093f] 51385 1727204611.47790: sending task result for task 0affcd87-79f5-6b1f-5706-00000000093f 51385 1727204611.47946: done sending task result for task 0affcd87-79f5-6b1f-5706-00000000093f 51385 1727204611.47948: WORKER PROCESS EXITING ok: [managed-node1] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 51385 1727204611.48041: no more pending results, returning what we have 51385 1727204611.48044: results queue empty 51385 1727204611.48046: checking for any_errors_fatal 51385 1727204611.48052: done 
checking for any_errors_fatal 51385 1727204611.48052: checking for max_fail_percentage 51385 1727204611.48054: done checking for max_fail_percentage 51385 1727204611.48055: checking to see if all hosts have failed and the running result is not ok 51385 1727204611.48056: done checking to see if all hosts have failed 51385 1727204611.48057: getting the remaining hosts for this loop 51385 1727204611.48058: done getting the remaining hosts for this loop 51385 1727204611.48062: getting the next task for host managed-node1 51385 1727204611.48070: done getting next task for host managed-node1 51385 1727204611.48072: ^ task is: TASK: Create veth interface {{ interface }} 51385 1727204611.48075: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51385 1727204611.48079: getting variables 51385 1727204611.48080: in VariableManager get_vars() 51385 1727204611.48120: Calling all_inventory to load vars for managed-node1 51385 1727204611.48123: Calling groups_inventory to load vars for managed-node1 51385 1727204611.48125: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204611.48135: Calling all_plugins_play to load vars for managed-node1 51385 1727204611.48137: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204611.48139: Calling groups_plugins_play to load vars for managed-node1 51385 1727204611.48989: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204611.50047: done with get_vars() 51385 1727204611.50080: done getting variables 51385 1727204611.50142: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 51385 1727204611.50281: variable 'interface' from source: play vars TASK [Create veth interface lsr101] ******************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 Tuesday 24 September 2024 15:03:31 -0400 (0:00:01.194) 0:00:29.907 ***** 51385 1727204611.50325: entering _queue_task() for managed-node1/command 51385 1727204611.50665: worker is 1 (out of 1 available) 51385 1727204611.50677: exiting _queue_task() for managed-node1/command 51385 1727204611.50687: done queuing things up, now waiting for results queue to drain 51385 1727204611.50689: waiting for pending results... 
51385 1727204611.51007: running TaskExecutor() for managed-node1/TASK: Create veth interface lsr101 51385 1727204611.51127: in run() - task 0affcd87-79f5-6b1f-5706-000000000940 51385 1727204611.51150: variable 'ansible_search_path' from source: unknown 51385 1727204611.51157: variable 'ansible_search_path' from source: unknown 51385 1727204611.51471: variable 'interface' from source: play vars 51385 1727204611.51571: variable 'interface' from source: play vars 51385 1727204611.51659: variable 'interface' from source: play vars 51385 1727204611.51831: Loaded config def from plugin (lookup/items) 51385 1727204611.51843: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py 51385 1727204611.51861: variable 'omit' from source: magic vars 51385 1727204611.51982: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204611.51989: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204611.51998: variable 'omit' from source: magic vars 51385 1727204611.52172: variable 'ansible_distribution_major_version' from source: facts 51385 1727204611.52178: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204611.52313: variable 'type' from source: play vars 51385 1727204611.52316: variable 'state' from source: include params 51385 1727204611.52319: variable 'interface' from source: play vars 51385 1727204611.52324: variable 'current_interfaces' from source: set_fact 51385 1727204611.52331: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): False 51385 1727204611.52333: when evaluation is False, skipping this task 51385 1727204611.52357: variable 'item' from source: unknown 51385 1727204611.52413: variable 'item' from source: unknown skipping: [managed-node1] => (item=ip link add lsr101 type veth peer name peerlsr101) => { "ansible_loop_var": "item", "changed": false, "false_condition": "type 
== 'veth' and state == 'present' and interface not in current_interfaces", "item": "ip link add lsr101 type veth peer name peerlsr101", "skip_reason": "Conditional result was False" } 51385 1727204611.52566: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204611.52569: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204611.52572: variable 'omit' from source: magic vars 51385 1727204611.52636: variable 'ansible_distribution_major_version' from source: facts 51385 1727204611.52640: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204611.52761: variable 'type' from source: play vars 51385 1727204611.52769: variable 'state' from source: include params 51385 1727204611.52772: variable 'interface' from source: play vars 51385 1727204611.52777: variable 'current_interfaces' from source: set_fact 51385 1727204611.52782: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): False 51385 1727204611.52785: when evaluation is False, skipping this task 51385 1727204611.52809: variable 'item' from source: unknown 51385 1727204611.52852: variable 'item' from source: unknown skipping: [managed-node1] => (item=ip link set peerlsr101 up) => { "ansible_loop_var": "item", "changed": false, "false_condition": "type == 'veth' and state == 'present' and interface not in current_interfaces", "item": "ip link set peerlsr101 up", "skip_reason": "Conditional result was False" } 51385 1727204611.52935: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204611.52938: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204611.52941: variable 'omit' from source: magic vars 51385 1727204611.53031: variable 'ansible_distribution_major_version' from source: facts 51385 1727204611.53034: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204611.53153: variable 
'type' from source: play vars 51385 1727204611.53156: variable 'state' from source: include params 51385 1727204611.53159: variable 'interface' from source: play vars 51385 1727204611.53161: variable 'current_interfaces' from source: set_fact 51385 1727204611.53171: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): False 51385 1727204611.53173: when evaluation is False, skipping this task 51385 1727204611.53191: variable 'item' from source: unknown 51385 1727204611.53233: variable 'item' from source: unknown skipping: [managed-node1] => (item=ip link set lsr101 up) => { "ansible_loop_var": "item", "changed": false, "false_condition": "type == 'veth' and state == 'present' and interface not in current_interfaces", "item": "ip link set lsr101 up", "skip_reason": "Conditional result was False" } 51385 1727204611.53307: dumping result to json 51385 1727204611.53310: done dumping result, returning 51385 1727204611.53312: done running TaskExecutor() for managed-node1/TASK: Create veth interface lsr101 [0affcd87-79f5-6b1f-5706-000000000940] 51385 1727204611.53314: sending task result for task 0affcd87-79f5-6b1f-5706-000000000940 51385 1727204611.53355: done sending task result for task 0affcd87-79f5-6b1f-5706-000000000940 51385 1727204611.53357: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false } MSG: All items skipped 51385 1727204611.53448: no more pending results, returning what we have 51385 1727204611.53451: results queue empty 51385 1727204611.53452: checking for any_errors_fatal 51385 1727204611.53458: done checking for any_errors_fatal 51385 1727204611.53459: checking for max_fail_percentage 51385 1727204611.53460: done checking for max_fail_percentage 51385 1727204611.53461: checking to see if all hosts have failed and the running result is not ok 51385 1727204611.53462: done checking to see if all hosts have failed 51385 1727204611.53463: getting the remaining hosts for this loop 51385 
1727204611.53465: done getting the remaining hosts for this loop 51385 1727204611.53468: getting the next task for host managed-node1 51385 1727204611.53473: done getting next task for host managed-node1 51385 1727204611.53475: ^ task is: TASK: Set up veth as managed by NetworkManager 51385 1727204611.53478: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 51385 1727204611.53481: getting variables 51385 1727204611.53483: in VariableManager get_vars() 51385 1727204611.53539: Calling all_inventory to load vars for managed-node1 51385 1727204611.53543: Calling groups_inventory to load vars for managed-node1 51385 1727204611.53546: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204611.53562: Calling all_plugins_play to load vars for managed-node1 51385 1727204611.53575: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204611.53580: Calling groups_plugins_play to load vars for managed-node1 51385 1727204611.55100: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204611.56452: done with get_vars() 51385 1727204611.56474: done getting variables 51385 1727204611.56518: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=True) TASK [Set up veth as managed by NetworkManager] ******************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:35 Tuesday 24 September 2024 15:03:31 -0400 (0:00:00.062) 0:00:29.969 ***** 51385 1727204611.56547: entering _queue_task() for managed-node1/command 51385 1727204611.56777: worker is 1 (out of 1 available) 51385 1727204611.56790: exiting _queue_task() for managed-node1/command 51385 1727204611.56802: done queuing things up, now waiting for results queue to drain 51385 1727204611.56803: waiting for pending results... 51385 1727204611.56994: running TaskExecutor() for managed-node1/TASK: Set up veth as managed by NetworkManager 51385 1727204611.57062: in run() - task 0affcd87-79f5-6b1f-5706-000000000941 51385 1727204611.57079: variable 'ansible_search_path' from source: unknown 51385 1727204611.57082: variable 'ansible_search_path' from source: unknown 51385 1727204611.57111: calling self._execute() 51385 1727204611.57192: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204611.57196: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204611.57205: variable 'omit' from source: magic vars 51385 1727204611.57494: variable 'ansible_distribution_major_version' from source: facts 51385 1727204611.57506: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204611.57616: variable 'type' from source: play vars 51385 1727204611.57619: variable 'state' from source: include params 51385 1727204611.57622: Evaluated conditional (type == 'veth' and state == 'present'): False 51385 1727204611.57624: when evaluation is False, skipping this task 51385 1727204611.57629: _execute() done 51385 1727204611.57633: dumping result to json 51385 1727204611.57635: done dumping result, returning 51385 1727204611.57642: done running TaskExecutor() for 
managed-node1/TASK: Set up veth as managed by NetworkManager [0affcd87-79f5-6b1f-5706-000000000941] 51385 1727204611.57648: sending task result for task 0affcd87-79f5-6b1f-5706-000000000941 51385 1727204611.57734: done sending task result for task 0affcd87-79f5-6b1f-5706-000000000941 51385 1727204611.57737: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "type == 'veth' and state == 'present'", "skip_reason": "Conditional result was False" } 51385 1727204611.57790: no more pending results, returning what we have 51385 1727204611.57794: results queue empty 51385 1727204611.57795: checking for any_errors_fatal 51385 1727204611.57807: done checking for any_errors_fatal 51385 1727204611.57810: checking for max_fail_percentage 51385 1727204611.57812: done checking for max_fail_percentage 51385 1727204611.57813: checking to see if all hosts have failed and the running result is not ok 51385 1727204611.57814: done checking to see if all hosts have failed 51385 1727204611.57814: getting the remaining hosts for this loop 51385 1727204611.57816: done getting the remaining hosts for this loop 51385 1727204611.57820: getting the next task for host managed-node1 51385 1727204611.57830: done getting next task for host managed-node1 51385 1727204611.57833: ^ task is: TASK: Delete veth interface {{ interface }} 51385 1727204611.57837: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51385 1727204611.57843: getting variables 51385 1727204611.57849: in VariableManager get_vars() 51385 1727204611.57901: Calling all_inventory to load vars for managed-node1 51385 1727204611.57904: Calling groups_inventory to load vars for managed-node1 51385 1727204611.57906: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204611.57918: Calling all_plugins_play to load vars for managed-node1 51385 1727204611.57921: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204611.57924: Calling groups_plugins_play to load vars for managed-node1 51385 1727204611.59353: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204611.60407: done with get_vars() 51385 1727204611.60422: done getting variables 51385 1727204611.60472: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 51385 1727204611.60556: variable 'interface' from source: play vars TASK [Delete veth interface lsr101] ******************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:43 Tuesday 24 September 2024 15:03:31 -0400 (0:00:00.040) 0:00:30.009 ***** 51385 1727204611.60583: entering _queue_task() for managed-node1/command 51385 1727204611.60803: worker is 1 (out of 1 available) 51385 1727204611.60817: exiting _queue_task() for managed-node1/command 51385 1727204611.60828: done queuing things up, now waiting for results queue to drain 51385 1727204611.60829: waiting for pending results... 
51385 1727204611.61010: running TaskExecutor() for managed-node1/TASK: Delete veth interface lsr101 51385 1727204611.61081: in run() - task 0affcd87-79f5-6b1f-5706-000000000942 51385 1727204611.61093: variable 'ansible_search_path' from source: unknown 51385 1727204611.61096: variable 'ansible_search_path' from source: unknown 51385 1727204611.61131: calling self._execute() 51385 1727204611.61204: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204611.61208: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204611.61216: variable 'omit' from source: magic vars 51385 1727204611.61492: variable 'ansible_distribution_major_version' from source: facts 51385 1727204611.61502: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204611.61643: variable 'type' from source: play vars 51385 1727204611.61647: variable 'state' from source: include params 51385 1727204611.61652: variable 'interface' from source: play vars 51385 1727204611.61655: variable 'current_interfaces' from source: set_fact 51385 1727204611.61669: Evaluated conditional (type == 'veth' and state == 'absent' and interface in current_interfaces): True 51385 1727204611.61673: variable 'omit' from source: magic vars 51385 1727204611.61700: variable 'omit' from source: magic vars 51385 1727204611.61767: variable 'interface' from source: play vars 51385 1727204611.61780: variable 'omit' from source: magic vars 51385 1727204611.61815: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 51385 1727204611.61844: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 51385 1727204611.61864: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 51385 1727204611.61876: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, 
class_only=False) 51385 1727204611.61888: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 51385 1727204611.61912: variable 'inventory_hostname' from source: host vars for 'managed-node1' 51385 1727204611.61915: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204611.61918: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204611.61988: Set connection var ansible_pipelining to False 51385 1727204611.61992: Set connection var ansible_shell_type to sh 51385 1727204611.62000: Set connection var ansible_module_compression to ZIP_DEFLATED 51385 1727204611.62007: Set connection var ansible_timeout to 10 51385 1727204611.62010: Set connection var ansible_connection to ssh 51385 1727204611.62015: Set connection var ansible_shell_executable to /bin/sh 51385 1727204611.62034: variable 'ansible_shell_executable' from source: unknown 51385 1727204611.62037: variable 'ansible_connection' from source: unknown 51385 1727204611.62041: variable 'ansible_module_compression' from source: unknown 51385 1727204611.62043: variable 'ansible_shell_type' from source: unknown 51385 1727204611.62046: variable 'ansible_shell_executable' from source: unknown 51385 1727204611.62049: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204611.62051: variable 'ansible_pipelining' from source: unknown 51385 1727204611.62053: variable 'ansible_timeout' from source: unknown 51385 1727204611.62055: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204611.62155: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 51385 1727204611.62166: 
variable 'omit' from source: magic vars 51385 1727204611.62172: starting attempt loop 51385 1727204611.62176: running the handler 51385 1727204611.62189: _low_level_execute_command(): starting 51385 1727204611.62195: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 51385 1727204611.62731: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204611.62741: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204611.62777: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204611.62791: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found <<< 51385 1727204611.62801: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204611.62847: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204611.62853: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204611.62870: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204611.62937: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204611.64511: stdout chunk (state=3): >>>/root <<< 51385 1727204611.64618: stderr chunk (state=3): >>>debug2: 
Received exit status from master 0 <<< 51385 1727204611.64673: stderr chunk (state=3): >>><<< 51385 1727204611.64676: stdout chunk (state=3): >>><<< 51385 1727204611.64698: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204611.64710: _low_level_execute_command(): starting 51385 1727204611.64715: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204611.646979-53309-44341528467791 `" && echo ansible-tmp-1727204611.646979-53309-44341528467791="` echo /root/.ansible/tmp/ansible-tmp-1727204611.646979-53309-44341528467791 `" ) && sleep 0' 51385 1727204611.65171: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204611.65183: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204611.65213: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 51385 1727204611.65221: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration <<< 51385 1727204611.65226: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204611.65245: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204611.65296: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204611.65309: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204611.65381: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204611.67223: stdout chunk (state=3): >>>ansible-tmp-1727204611.646979-53309-44341528467791=/root/.ansible/tmp/ansible-tmp-1727204611.646979-53309-44341528467791 <<< 51385 1727204611.67332: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204611.67397: stderr chunk (state=3): >>><<< 51385 1727204611.67400: stdout chunk (state=3): >>><<< 51385 1727204611.67417: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204611.646979-53309-44341528467791=/root/.ansible/tmp/ansible-tmp-1727204611.646979-53309-44341528467791 , stderr=OpenSSH_8.7p1, OpenSSL 
3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204611.67447: variable 'ansible_module_compression' from source: unknown 51385 1727204611.67496: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-51385tpxlmlox/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 51385 1727204611.67526: variable 'ansible_facts' from source: unknown 51385 1727204611.67593: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204611.646979-53309-44341528467791/AnsiballZ_command.py 51385 1727204611.67708: Sending initial data 51385 1727204611.67711: Sent initial data (154 bytes) 51385 1727204611.68425: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204611.68429: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 
1727204611.68462: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 51385 1727204611.68468: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204611.68470: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204611.68528: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204611.68531: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204611.68533: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204611.68593: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204611.70299: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 51385 1727204611.70346: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 
261120 debug1: Server handle limit 1019; using 64 <<< 51385 1727204611.70400: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-51385tpxlmlox/tmpvuu9wtyt /root/.ansible/tmp/ansible-tmp-1727204611.646979-53309-44341528467791/AnsiballZ_command.py <<< 51385 1727204611.70452: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 51385 1727204611.71296: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204611.71414: stderr chunk (state=3): >>><<< 51385 1727204611.71417: stdout chunk (state=3): >>><<< 51385 1727204611.71435: done transferring module to remote 51385 1727204611.71445: _low_level_execute_command(): starting 51385 1727204611.71450: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204611.646979-53309-44341528467791/ /root/.ansible/tmp/ansible-tmp-1727204611.646979-53309-44341528467791/AnsiballZ_command.py && sleep 0' 51385 1727204611.71919: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204611.71925: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204611.71965: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204611.71969: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration <<< 51385 1727204611.71977: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204611.71989: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204611.71996: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204611.72056: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204611.72065: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204611.72132: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204611.73829: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204611.73915: stderr chunk (state=3): >>><<< 51385 1727204611.73920: stdout chunk (state=3): >>><<< 51385 1727204611.73945: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 
debug2: Received exit status from master 0 51385 1727204611.73951: _low_level_execute_command(): starting 51385 1727204611.73956: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204611.646979-53309-44341528467791/AnsiballZ_command.py && sleep 0' 51385 1727204611.74691: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204611.74700: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204611.74716: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204611.74730: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204611.74780: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204611.74787: stderr chunk (state=3): >>>debug2: match not found <<< 51385 1727204611.74797: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204611.74810: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 51385 1727204611.74820: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 51385 1727204611.74829: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 51385 1727204611.74838: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204611.74852: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204611.74872: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204611.74880: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204611.74886: stderr chunk (state=3): >>>debug2: match found <<< 51385 1727204611.74896: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204611.74972: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204611.74995: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204611.75006: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204611.75099: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204611.89510: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "lsr101", "type", "veth"], "start": "2024-09-24 15:03:31.881131", "end": "2024-09-24 15:03:31.893239", "delta": "0:00:00.012108", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del lsr101 type veth", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 51385 1727204611.91073: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
<<< 51385 1727204611.91078: stdout chunk (state=3): >>><<< 51385 1727204611.91085: stderr chunk (state=3): >>><<< 51385 1727204611.91108: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "lsr101", "type", "veth"], "start": "2024-09-24 15:03:31.881131", "end": "2024-09-24 15:03:31.893239", "delta": "0:00:00.012108", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del lsr101 type veth", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
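
The successful module invocation above (`ip link del lsr101 type veth`, run via `ansible.legacy.command`) corresponds to a task roughly like the following. This is a hypothetical reconstruction from the logged task name and `module_args`; the actual task lives in `tests/network/playbooks/tasks/manage_test_interface.yml` and may differ in detail (in particular, the `when` conditions are inferred by analogy with the dummy/tap tasks logged later, not quoted from this record):

```yaml
# Sketch only -- reconstructed from the log, not copied from the real file.
- name: Delete veth interface {{ interface }}
  command: ip link del {{ interface }} type veth
  # Inferred condition (assumption): mirrors the pattern of the sibling
  # dummy/tap tasks whose conditionals appear verbatim in later records.
  when: type == 'veth' and state == 'absent' and interface in current_interfaces
```

Note that the module itself reported `"changed": true`, while the final task result is shown as `ok` with `"changed": false`; this is consistent with the task (or play) applying something like `changed_when: false` after the handler runs, which matches the `Evaluated conditional (False): False` record that follows the module output.
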
51385 1727204611.91149: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del lsr101 type veth', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204611.646979-53309-44341528467791/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 51385 1727204611.91157: _low_level_execute_command(): starting 51385 1727204611.91165: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204611.646979-53309-44341528467791/ > /dev/null 2>&1 && sleep 0' 51385 1727204611.92791: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204611.92799: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204611.92810: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204611.92825: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204611.92867: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204611.92936: stderr chunk (state=3): >>>debug2: match not found <<< 51385 1727204611.92947: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204611.92962: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 51385 1727204611.92968: stderr chunk (state=3): 
>>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 51385 1727204611.92975: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 51385 1727204611.92984: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204611.92997: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204611.93008: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204611.93015: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204611.93021: stderr chunk (state=3): >>>debug2: match found <<< 51385 1727204611.93032: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204611.93112: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204611.93127: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204611.93130: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204611.93252: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204611.95092: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204611.95095: stdout chunk (state=3): >>><<< 51385 1727204611.95103: stderr chunk (state=3): >>><<< 51385 1727204611.95124: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204611.95131: handler run complete 51385 1727204611.95155: Evaluated conditional (False): False 51385 1727204611.95167: attempt loop complete, returning result 51385 1727204611.95170: _execute() done 51385 1727204611.95172: dumping result to json 51385 1727204611.95178: done dumping result, returning 51385 1727204611.95186: done running TaskExecutor() for managed-node1/TASK: Delete veth interface lsr101 [0affcd87-79f5-6b1f-5706-000000000942] 51385 1727204611.95192: sending task result for task 0affcd87-79f5-6b1f-5706-000000000942 51385 1727204611.95296: done sending task result for task 0affcd87-79f5-6b1f-5706-000000000942 51385 1727204611.95299: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "cmd": [ "ip", "link", "del", "lsr101", "type", "veth" ], "delta": "0:00:00.012108", "end": "2024-09-24 15:03:31.893239", "rc": 0, "start": "2024-09-24 15:03:31.881131" } 51385 1727204611.95534: no more pending results, returning what we have 51385 1727204611.95537: results queue empty 51385 1727204611.95538: checking for any_errors_fatal 51385 1727204611.95544: done checking for any_errors_fatal 51385 1727204611.95544: checking for max_fail_percentage 51385 1727204611.95546: done checking for max_fail_percentage 51385 1727204611.95547: checking to see if all hosts have failed and the 
running result is not ok 51385 1727204611.95548: done checking to see if all hosts have failed 51385 1727204611.95548: getting the remaining hosts for this loop 51385 1727204611.95550: done getting the remaining hosts for this loop 51385 1727204611.95553: getting the next task for host managed-node1 51385 1727204611.95559: done getting next task for host managed-node1 51385 1727204611.95562: ^ task is: TASK: Create dummy interface {{ interface }} 51385 1727204611.95566: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51385 1727204611.95570: getting variables 51385 1727204611.95572: in VariableManager get_vars() 51385 1727204611.95613: Calling all_inventory to load vars for managed-node1 51385 1727204611.95615: Calling groups_inventory to load vars for managed-node1 51385 1727204611.95618: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204611.95628: Calling all_plugins_play to load vars for managed-node1 51385 1727204611.95631: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204611.95634: Calling groups_plugins_play to load vars for managed-node1 51385 1727204611.97870: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204612.00648: done with get_vars() 51385 1727204612.00678: done getting variables 51385 1727204612.00743: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 51385 1727204612.00866: variable 'interface' from source: play vars TASK [Create dummy interface lsr101] ******************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:49 Tuesday 24 September 2024 15:03:32 -0400 (0:00:00.403) 0:00:30.412 ***** 51385 1727204612.00898: entering _queue_task() for managed-node1/command 51385 1727204612.01237: worker is 1 (out of 1 available) 51385 1727204612.01249: exiting _queue_task() for managed-node1/command 51385 1727204612.01260: done queuing things up, now waiting for results queue to drain 51385 1727204612.01261: waiting for pending results... 
51385 1727204612.01571: running TaskExecutor() for managed-node1/TASK: Create dummy interface lsr101 51385 1727204612.01666: in run() - task 0affcd87-79f5-6b1f-5706-000000000943 51385 1727204612.01678: variable 'ansible_search_path' from source: unknown 51385 1727204612.01682: variable 'ansible_search_path' from source: unknown 51385 1727204612.01725: calling self._execute() 51385 1727204612.01907: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204612.01913: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204612.01923: variable 'omit' from source: magic vars 51385 1727204612.02361: variable 'ansible_distribution_major_version' from source: facts 51385 1727204612.02367: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204612.02600: variable 'type' from source: play vars 51385 1727204612.02610: variable 'state' from source: include params 51385 1727204612.02616: variable 'interface' from source: play vars 51385 1727204612.02619: variable 'current_interfaces' from source: set_fact 51385 1727204612.02627: Evaluated conditional (type == 'dummy' and state == 'present' and interface not in current_interfaces): False 51385 1727204612.02630: when evaluation is False, skipping this task 51385 1727204612.02633: _execute() done 51385 1727204612.02635: dumping result to json 51385 1727204612.02637: done dumping result, returning 51385 1727204612.02644: done running TaskExecutor() for managed-node1/TASK: Create dummy interface lsr101 [0affcd87-79f5-6b1f-5706-000000000943] 51385 1727204612.02651: sending task result for task 0affcd87-79f5-6b1f-5706-000000000943 51385 1727204612.02743: done sending task result for task 0affcd87-79f5-6b1f-5706-000000000943 51385 1727204612.02747: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "type == 'dummy' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional 
result was False" } 51385 1727204612.02798: no more pending results, returning what we have 51385 1727204612.02803: results queue empty 51385 1727204612.02804: checking for any_errors_fatal 51385 1727204612.02814: done checking for any_errors_fatal 51385 1727204612.02815: checking for max_fail_percentage 51385 1727204612.02817: done checking for max_fail_percentage 51385 1727204612.02818: checking to see if all hosts have failed and the running result is not ok 51385 1727204612.02819: done checking to see if all hosts have failed 51385 1727204612.02820: getting the remaining hosts for this loop 51385 1727204612.02821: done getting the remaining hosts for this loop 51385 1727204612.02825: getting the next task for host managed-node1 51385 1727204612.02832: done getting next task for host managed-node1 51385 1727204612.02835: ^ task is: TASK: Delete dummy interface {{ interface }} 51385 1727204612.02840: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51385 1727204612.02844: getting variables 51385 1727204612.02846: in VariableManager get_vars() 51385 1727204612.02896: Calling all_inventory to load vars for managed-node1 51385 1727204612.02899: Calling groups_inventory to load vars for managed-node1 51385 1727204612.02902: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204612.02917: Calling all_plugins_play to load vars for managed-node1 51385 1727204612.02921: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204612.02924: Calling groups_plugins_play to load vars for managed-node1 51385 1727204612.05428: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204612.07259: done with get_vars() 51385 1727204612.07393: done getting variables 51385 1727204612.07453: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 51385 1727204612.07680: variable 'interface' from source: play vars TASK [Delete dummy interface lsr101] ******************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:54 Tuesday 24 September 2024 15:03:32 -0400 (0:00:00.069) 0:00:30.482 ***** 51385 1727204612.07826: entering _queue_task() for managed-node1/command 51385 1727204612.08738: worker is 1 (out of 1 available) 51385 1727204612.08752: exiting _queue_task() for managed-node1/command 51385 1727204612.08766: done queuing things up, now waiting for results queue to drain 51385 1727204612.08768: waiting for pending results... 
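
The skipped-task records above show the conditional pattern the play uses for each interface type. The `false_condition` strings below are quoted verbatim from the log; the `command` bodies are assumptions added for illustration (the real tasks in `manage_test_interface.yml` may invoke `ip` differently):

```yaml
# Hypothetical reconstruction of the create/delete task pair for the
# 'dummy' type; only the 'when' expressions are taken from the log.
- name: Create dummy interface {{ interface }}
  command: ip link add {{ interface }} type dummy   # assumed command
  when: type == 'dummy' and state == 'present' and interface not in current_interfaces

- name: Delete dummy interface {{ interface }}
  command: ip link del {{ interface }} type dummy   # assumed command
  when: type == 'dummy' and state == 'absent' and interface in current_interfaces
```

With `type` set to something other than `dummy` (or the interface membership test failing, as here where `lsr101` was already removed), both conditionals evaluate to `False` and Ansible records the tasks as skipped, exactly as the `skipping: [managed-node1]` results show.
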
51385 1727204612.09151: running TaskExecutor() for managed-node1/TASK: Delete dummy interface lsr101 51385 1727204612.09216: in run() - task 0affcd87-79f5-6b1f-5706-000000000944 51385 1727204612.09230: variable 'ansible_search_path' from source: unknown 51385 1727204612.09235: variable 'ansible_search_path' from source: unknown 51385 1727204612.09279: calling self._execute() 51385 1727204612.09383: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204612.09387: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204612.09399: variable 'omit' from source: magic vars 51385 1727204612.09807: variable 'ansible_distribution_major_version' from source: facts 51385 1727204612.09822: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204612.10034: variable 'type' from source: play vars 51385 1727204612.10038: variable 'state' from source: include params 51385 1727204612.10043: variable 'interface' from source: play vars 51385 1727204612.10046: variable 'current_interfaces' from source: set_fact 51385 1727204612.10056: Evaluated conditional (type == 'dummy' and state == 'absent' and interface in current_interfaces): False 51385 1727204612.10062: when evaluation is False, skipping this task 51385 1727204612.10068: _execute() done 51385 1727204612.10070: dumping result to json 51385 1727204612.10073: done dumping result, returning 51385 1727204612.10075: done running TaskExecutor() for managed-node1/TASK: Delete dummy interface lsr101 [0affcd87-79f5-6b1f-5706-000000000944] 51385 1727204612.10088: sending task result for task 0affcd87-79f5-6b1f-5706-000000000944 51385 1727204612.10186: done sending task result for task 0affcd87-79f5-6b1f-5706-000000000944 51385 1727204612.10190: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "type == 'dummy' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was 
False" } 51385 1727204612.10244: no more pending results, returning what we have 51385 1727204612.10249: results queue empty 51385 1727204612.10250: checking for any_errors_fatal 51385 1727204612.10258: done checking for any_errors_fatal 51385 1727204612.10259: checking for max_fail_percentage 51385 1727204612.10261: done checking for max_fail_percentage 51385 1727204612.10262: checking to see if all hosts have failed and the running result is not ok 51385 1727204612.10262: done checking to see if all hosts have failed 51385 1727204612.10263: getting the remaining hosts for this loop 51385 1727204612.10267: done getting the remaining hosts for this loop 51385 1727204612.10272: getting the next task for host managed-node1 51385 1727204612.10279: done getting next task for host managed-node1 51385 1727204612.10282: ^ task is: TASK: Create tap interface {{ interface }} 51385 1727204612.10286: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51385 1727204612.10290: getting variables 51385 1727204612.10293: in VariableManager get_vars() 51385 1727204612.10340: Calling all_inventory to load vars for managed-node1 51385 1727204612.10344: Calling groups_inventory to load vars for managed-node1 51385 1727204612.10347: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204612.10360: Calling all_plugins_play to load vars for managed-node1 51385 1727204612.10366: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204612.10370: Calling groups_plugins_play to load vars for managed-node1 51385 1727204612.12190: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204612.14218: done with get_vars() 51385 1727204612.14253: done getting variables 51385 1727204612.14318: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 51385 1727204612.14438: variable 'interface' from source: play vars TASK [Create tap interface lsr101] ********************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:60 Tuesday 24 September 2024 15:03:32 -0400 (0:00:00.066) 0:00:30.548 ***** 51385 1727204612.14476: entering _queue_task() for managed-node1/command 51385 1727204612.15071: worker is 1 (out of 1 available) 51385 1727204612.15084: exiting _queue_task() for managed-node1/command 51385 1727204612.15095: done queuing things up, now waiting for results queue to drain 51385 1727204612.15097: waiting for pending results... 
51385 1727204612.16132: running TaskExecutor() for managed-node1/TASK: Create tap interface lsr101 51385 1727204612.16355: in run() - task 0affcd87-79f5-6b1f-5706-000000000945 51385 1727204612.16384: variable 'ansible_search_path' from source: unknown 51385 1727204612.16393: variable 'ansible_search_path' from source: unknown 51385 1727204612.16436: calling self._execute() 51385 1727204612.16590: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204612.16606: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204612.16627: variable 'omit' from source: magic vars 51385 1727204612.17468: variable 'ansible_distribution_major_version' from source: facts 51385 1727204612.17481: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204612.17741: variable 'type' from source: play vars 51385 1727204612.17744: variable 'state' from source: include params 51385 1727204612.17746: variable 'interface' from source: play vars 51385 1727204612.17749: variable 'current_interfaces' from source: set_fact 51385 1727204612.17986: Evaluated conditional (type == 'tap' and state == 'present' and interface not in current_interfaces): False 51385 1727204612.17989: when evaluation is False, skipping this task 51385 1727204612.17993: _execute() done 51385 1727204612.17995: dumping result to json 51385 1727204612.17998: done dumping result, returning 51385 1727204612.18003: done running TaskExecutor() for managed-node1/TASK: Create tap interface lsr101 [0affcd87-79f5-6b1f-5706-000000000945] 51385 1727204612.18011: sending task result for task 0affcd87-79f5-6b1f-5706-000000000945 skipping: [managed-node1] => { "changed": false, "false_condition": "type == 'tap' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 51385 1727204612.18155: no more pending results, returning what we have 51385 1727204612.18159: results queue empty 51385 
1727204612.18160: checking for any_errors_fatal 51385 1727204612.18169: done checking for any_errors_fatal 51385 1727204612.18170: checking for max_fail_percentage 51385 1727204612.18172: done checking for max_fail_percentage 51385 1727204612.18173: checking to see if all hosts have failed and the running result is not ok 51385 1727204612.18174: done checking to see if all hosts have failed 51385 1727204612.18176: getting the remaining hosts for this loop 51385 1727204612.18177: done getting the remaining hosts for this loop 51385 1727204612.18182: getting the next task for host managed-node1 51385 1727204612.18188: done getting next task for host managed-node1 51385 1727204612.18192: ^ task is: TASK: Delete tap interface {{ interface }} 51385 1727204612.18195: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51385 1727204612.18199: getting variables 51385 1727204612.18202: in VariableManager get_vars() 51385 1727204612.18251: Calling all_inventory to load vars for managed-node1 51385 1727204612.18254: Calling groups_inventory to load vars for managed-node1 51385 1727204612.18256: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204612.18281: Calling all_plugins_play to load vars for managed-node1 51385 1727204612.18285: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204612.18290: Calling groups_plugins_play to load vars for managed-node1 51385 1727204612.18811: done sending task result for task 0affcd87-79f5-6b1f-5706-000000000945 51385 1727204612.18815: WORKER PROCESS EXITING 51385 1727204612.20400: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204612.22120: done with get_vars() 51385 1727204612.22157: done getting variables 51385 1727204612.22227: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 51385 1727204612.22353: variable 'interface' from source: play vars TASK [Delete tap interface lsr101] ********************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:65 Tuesday 24 September 2024 15:03:32 -0400 (0:00:00.079) 0:00:30.627 ***** 51385 1727204612.22391: entering _queue_task() for managed-node1/command 51385 1727204612.22756: worker is 1 (out of 1 available) 51385 1727204612.22775: exiting _queue_task() for managed-node1/command 51385 1727204612.22793: done queuing things up, now waiting for results queue to drain 51385 1727204612.22795: waiting for pending results... 
51385 1727204612.23114: running TaskExecutor() for managed-node1/TASK: Delete tap interface lsr101 51385 1727204612.23242: in run() - task 0affcd87-79f5-6b1f-5706-000000000946 51385 1727204612.23271: variable 'ansible_search_path' from source: unknown 51385 1727204612.23280: variable 'ansible_search_path' from source: unknown 51385 1727204612.23324: calling self._execute() 51385 1727204612.23437: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204612.23449: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204612.23470: variable 'omit' from source: magic vars 51385 1727204612.23868: variable 'ansible_distribution_major_version' from source: facts 51385 1727204612.23885: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204612.24089: variable 'type' from source: play vars 51385 1727204612.24099: variable 'state' from source: include params 51385 1727204612.24107: variable 'interface' from source: play vars 51385 1727204612.24119: variable 'current_interfaces' from source: set_fact 51385 1727204612.24130: Evaluated conditional (type == 'tap' and state == 'absent' and interface in current_interfaces): False 51385 1727204612.24136: when evaluation is False, skipping this task 51385 1727204612.24143: _execute() done 51385 1727204612.24151: dumping result to json 51385 1727204612.24157: done dumping result, returning 51385 1727204612.24170: done running TaskExecutor() for managed-node1/TASK: Delete tap interface lsr101 [0affcd87-79f5-6b1f-5706-000000000946] 51385 1727204612.24181: sending task result for task 0affcd87-79f5-6b1f-5706-000000000946 skipping: [managed-node1] => { "changed": false, "false_condition": "type == 'tap' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 51385 1727204612.24313: no more pending results, returning what we have 51385 1727204612.24317: results queue empty 51385 1727204612.24318: 
checking for any_errors_fatal 51385 1727204612.24326: done checking for any_errors_fatal 51385 1727204612.24327: checking for max_fail_percentage 51385 1727204612.24329: done checking for max_fail_percentage 51385 1727204612.24330: checking to see if all hosts have failed and the running result is not ok 51385 1727204612.24331: done checking to see if all hosts have failed 51385 1727204612.24332: getting the remaining hosts for this loop 51385 1727204612.24334: done getting the remaining hosts for this loop 51385 1727204612.24338: getting the next task for host managed-node1 51385 1727204612.24348: done getting next task for host managed-node1 51385 1727204612.24352: ^ task is: TASK: Verify network state restored to default 51385 1727204612.24354: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51385 1727204612.24361: getting variables 51385 1727204612.24363: in VariableManager get_vars() 51385 1727204612.24410: Calling all_inventory to load vars for managed-node1 51385 1727204612.24414: Calling groups_inventory to load vars for managed-node1 51385 1727204612.24416: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204612.24431: Calling all_plugins_play to load vars for managed-node1 51385 1727204612.24434: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204612.24437: Calling groups_plugins_play to load vars for managed-node1 51385 1727204612.25483: done sending task result for task 0affcd87-79f5-6b1f-5706-000000000946 51385 1727204612.25487: WORKER PROCESS EXITING 51385 1727204612.26194: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204612.27912: done with get_vars() 51385 1727204612.27938: done getting variables TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_vlan_mtu.yml:77 Tuesday 24 September 2024 15:03:32 -0400 (0:00:00.056) 0:00:30.684 ***** 51385 1727204612.28034: entering _queue_task() for managed-node1/include_tasks 51385 1727204612.28357: worker is 1 (out of 1 available) 51385 1727204612.28372: exiting _queue_task() for managed-node1/include_tasks 51385 1727204612.28384: done queuing things up, now waiting for results queue to drain 51385 1727204612.28385: waiting for pending results... 
51385 1727204612.28691: running TaskExecutor() for managed-node1/TASK: Verify network state restored to default 51385 1727204612.28811: in run() - task 0affcd87-79f5-6b1f-5706-0000000000ab 51385 1727204612.28835: variable 'ansible_search_path' from source: unknown 51385 1727204612.28887: calling self._execute() 51385 1727204612.29008: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204612.29022: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204612.29039: variable 'omit' from source: magic vars 51385 1727204612.29466: variable 'ansible_distribution_major_version' from source: facts 51385 1727204612.29489: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204612.29500: _execute() done 51385 1727204612.29507: dumping result to json 51385 1727204612.29515: done dumping result, returning 51385 1727204612.29524: done running TaskExecutor() for managed-node1/TASK: Verify network state restored to default [0affcd87-79f5-6b1f-5706-0000000000ab] 51385 1727204612.29534: sending task result for task 0affcd87-79f5-6b1f-5706-0000000000ab 51385 1727204612.29679: no more pending results, returning what we have 51385 1727204612.29685: in VariableManager get_vars() 51385 1727204612.29737: Calling all_inventory to load vars for managed-node1 51385 1727204612.29740: Calling groups_inventory to load vars for managed-node1 51385 1727204612.29742: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204612.29757: Calling all_plugins_play to load vars for managed-node1 51385 1727204612.29765: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204612.29769: Calling groups_plugins_play to load vars for managed-node1 51385 1727204612.30785: done sending task result for task 0affcd87-79f5-6b1f-5706-0000000000ab 51385 1727204612.30788: WORKER PROCESS EXITING 51385 1727204612.31539: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204612.33281: done with get_vars() 51385 1727204612.33312: variable 'ansible_search_path' from source: unknown 51385 1727204612.33331: we have included files to process 51385 1727204612.33333: generating all_blocks data 51385 1727204612.33334: done generating all_blocks data 51385 1727204612.33340: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 51385 1727204612.33341: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 51385 1727204612.33344: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 51385 1727204612.33792: done processing included file 51385 1727204612.33795: iterating over new_blocks loaded from include file 51385 1727204612.33796: in VariableManager get_vars() 51385 1727204612.33818: done with get_vars() 51385 1727204612.33820: filtering new block on tags 51385 1727204612.33840: done filtering new block on tags 51385 1727204612.33843: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed-node1 51385 1727204612.33849: extending task lists for all hosts with included blocks 51385 1727204612.37225: done extending task lists 51385 1727204612.37227: done processing included files 51385 1727204612.37228: results queue empty 51385 1727204612.37229: checking for any_errors_fatal 51385 1727204612.37232: done checking for any_errors_fatal 51385 1727204612.37233: checking for max_fail_percentage 51385 1727204612.37234: done checking for max_fail_percentage 51385 1727204612.37236: checking to see if all hosts have failed and the running 
result is not ok 51385 1727204612.37237: done checking to see if all hosts have failed 51385 1727204612.37239: getting the remaining hosts for this loop 51385 1727204612.37240: done getting the remaining hosts for this loop 51385 1727204612.37243: getting the next task for host managed-node1 51385 1727204612.37247: done getting next task for host managed-node1 51385 1727204612.37250: ^ task is: TASK: Check routes and DNS 51385 1727204612.37252: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51385 1727204612.37255: getting variables 51385 1727204612.37256: in VariableManager get_vars() 51385 1727204612.37279: Calling all_inventory to load vars for managed-node1 51385 1727204612.37282: Calling groups_inventory to load vars for managed-node1 51385 1727204612.37284: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204612.37291: Calling all_plugins_play to load vars for managed-node1 51385 1727204612.37293: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204612.37296: Calling groups_plugins_play to load vars for managed-node1 51385 1727204612.38723: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204612.40463: done with get_vars() 51385 1727204612.40492: done getting variables 51385 1727204612.40546: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Tuesday 24 September 2024 15:03:32 -0400 (0:00:00.125) 0:00:30.809 ***** 51385 1727204612.40580: entering _queue_task() for managed-node1/shell 51385 1727204612.40930: worker is 1 (out of 1 available) 51385 1727204612.40942: exiting _queue_task() for managed-node1/shell 51385 1727204612.40954: done queuing things up, now waiting for results queue to drain 51385 1727204612.40955: waiting for pending results... 
51385 1727204612.41270: running TaskExecutor() for managed-node1/TASK: Check routes and DNS 51385 1727204612.41389: in run() - task 0affcd87-79f5-6b1f-5706-000000000b17 51385 1727204612.41416: variable 'ansible_search_path' from source: unknown 51385 1727204612.41423: variable 'ansible_search_path' from source: unknown 51385 1727204612.41468: calling self._execute() 51385 1727204612.41576: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204612.41587: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204612.41601: variable 'omit' from source: magic vars 51385 1727204612.42018: variable 'ansible_distribution_major_version' from source: facts 51385 1727204612.42036: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204612.42053: variable 'omit' from source: magic vars 51385 1727204612.42109: variable 'omit' from source: magic vars 51385 1727204612.42149: variable 'omit' from source: magic vars 51385 1727204612.42208: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 51385 1727204612.42250: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 51385 1727204612.42289: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 51385 1727204612.42314: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 51385 1727204612.42330: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 51385 1727204612.42371: variable 'inventory_hostname' from source: host vars for 'managed-node1' 51385 1727204612.42385: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204612.42396: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204612.42512: 
Set connection var ansible_pipelining to False 51385 1727204612.42521: Set connection var ansible_shell_type to sh 51385 1727204612.42535: Set connection var ansible_module_compression to ZIP_DEFLATED 51385 1727204612.42548: Set connection var ansible_timeout to 10 51385 1727204612.42555: Set connection var ansible_connection to ssh 51385 1727204612.42571: Set connection var ansible_shell_executable to /bin/sh 51385 1727204612.42605: variable 'ansible_shell_executable' from source: unknown 51385 1727204612.42615: variable 'ansible_connection' from source: unknown 51385 1727204612.42622: variable 'ansible_module_compression' from source: unknown 51385 1727204612.42628: variable 'ansible_shell_type' from source: unknown 51385 1727204612.42634: variable 'ansible_shell_executable' from source: unknown 51385 1727204612.42640: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204612.42647: variable 'ansible_pipelining' from source: unknown 51385 1727204612.42653: variable 'ansible_timeout' from source: unknown 51385 1727204612.42662: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204612.42813: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 51385 1727204612.42839: variable 'omit' from source: magic vars 51385 1727204612.42851: starting attempt loop 51385 1727204612.42857: running the handler 51385 1727204612.42876: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 51385 1727204612.42901: 
_low_level_execute_command(): starting 51385 1727204612.42913: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 51385 1727204612.43746: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204612.43766: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204612.43782: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204612.43807: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204612.43856: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204612.43873: stderr chunk (state=3): >>>debug2: match not found <<< 51385 1727204612.43888: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204612.43905: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 51385 1727204612.43921: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 51385 1727204612.43934: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 51385 1727204612.43945: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204612.43958: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204612.43979: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204612.43991: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204612.44002: stderr chunk (state=3): >>>debug2: match found <<< 51385 1727204612.44014: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204612.44103: stderr chunk (state=3): >>>debug1: auto-mux: Trying 
existing master <<< 51385 1727204612.44127: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204612.44150: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204612.44246: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204612.45928: stdout chunk (state=3): >>>/root <<< 51385 1727204612.46084: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204612.46147: stderr chunk (state=3): >>><<< 51385 1727204612.46151: stdout chunk (state=3): >>><<< 51385 1727204612.46281: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204612.46285: _low_level_execute_command(): starting 51385 1727204612.46296: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo 
/root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204612.4618027-53343-264153716547897 `" && echo ansible-tmp-1727204612.4618027-53343-264153716547897="` echo /root/.ansible/tmp/ansible-tmp-1727204612.4618027-53343-264153716547897 `" ) && sleep 0' 51385 1727204612.47076: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204612.47092: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204612.47107: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204612.47127: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204612.47182: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204612.47194: stderr chunk (state=3): >>>debug2: match not found <<< 51385 1727204612.47207: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204612.47223: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 51385 1727204612.47233: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 51385 1727204612.47248: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 51385 1727204612.47271: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204612.47293: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204612.47309: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204612.47322: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204612.47335: stderr chunk (state=3): >>>debug2: match found <<< 51385 1727204612.47349: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204612.47434: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204612.47457: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204612.47479: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204612.47579: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204612.49433: stdout chunk (state=3): >>>ansible-tmp-1727204612.4618027-53343-264153716547897=/root/.ansible/tmp/ansible-tmp-1727204612.4618027-53343-264153716547897 <<< 51385 1727204612.49548: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204612.49640: stderr chunk (state=3): >>><<< 51385 1727204612.49652: stdout chunk (state=3): >>><<< 51385 1727204612.49971: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204612.4618027-53343-264153716547897=/root/.ansible/tmp/ansible-tmp-1727204612.4618027-53343-264153716547897 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204612.49975: variable 'ansible_module_compression' from source: unknown 51385 1727204612.49978: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-51385tpxlmlox/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 51385 1727204612.49980: variable 'ansible_facts' from source: unknown 51385 1727204612.49982: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204612.4618027-53343-264153716547897/AnsiballZ_command.py 51385 1727204612.50082: Sending initial data 51385 1727204612.50085: Sent initial data (156 bytes) 51385 1727204612.50991: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204612.51001: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204612.51011: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204612.51023: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204612.51054: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204612.51072: stderr chunk (state=3): >>>debug2: match not found <<< 51385 1727204612.51103: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204612.51106: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204612.51108: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found <<< 51385 1727204612.51111: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204612.51152: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204612.51159: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204612.51236: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204612.52942: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 51385 1727204612.52988: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 51385 1727204612.53046: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-51385tpxlmlox/tmpdc4jqim9 /root/.ansible/tmp/ansible-tmp-1727204612.4618027-53343-264153716547897/AnsiballZ_command.py <<< 51385 1727204612.53094: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 51385 1727204612.54302: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204612.54429: stderr chunk (state=3): >>><<< 51385 1727204612.54432: stdout chunk (state=3): >>><<< 51385 1727204612.54449: done transferring module to remote 51385 1727204612.54459: 
_low_level_execute_command(): starting 51385 1727204612.54467: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204612.4618027-53343-264153716547897/ /root/.ansible/tmp/ansible-tmp-1727204612.4618027-53343-264153716547897/AnsiballZ_command.py && sleep 0' 51385 1727204612.54913: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204612.54927: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204612.54939: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204612.54956: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 51385 1727204612.54972: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204612.54983: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204612.55039: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204612.55045: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204612.55106: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204612.56826: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 
1727204612.56891: stderr chunk (state=3): >>><<< 51385 1727204612.56895: stdout chunk (state=3): >>><<< 51385 1727204612.56934: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204612.56938: _low_level_execute_command(): starting 51385 1727204612.56940: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204612.4618027-53343-264153716547897/AnsiballZ_command.py && sleep 0' 51385 1727204612.57600: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204612.57609: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204612.57619: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204612.57634: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204612.57676: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204612.57680: stderr chunk (state=3): >>>debug2: match not found <<< 51385 1727204612.57688: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204612.57701: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 51385 1727204612.57709: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 51385 1727204612.57715: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 51385 1727204612.57723: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204612.57731: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204612.57742: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204612.57753: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204612.57756: stderr chunk (state=3): >>>debug2: match found <<< 51385 1727204612.57765: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204612.57840: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204612.57870: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204612.57874: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204612.57964: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204612.71784: stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 
00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 12:8f:92:e7:c1:ab brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.9.148/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0\n valid_lft 3549sec preferred_lft 3549sec\n inet6 fe80::108f:92ff:fee7:c1ab/64 scope link \n valid_lft forever preferred_lft forever\n21: rpltstbr: mtu 1500 qdisc noqueue state DOWN group default qlen 1000\n link/ether 4a:d1:a2:43:cd:1d brd ff:ff:ff:ff:ff:ff\n inet 192.0.2.72/31 scope global noprefixroute rpltstbr\n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.8.1 dev eth0 proto dhcp src 10.31.9.148 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.9.148 metric 100 \n192.0.2.72/31 dev rpltstbr proto kernel scope link src 192.0.2.72 metric 425 linkdown \nIP -6 ROUTE\n::1 dev lo proto kernel metric 256 pref medium\nfe80::/64 dev eth0 proto kernel metric 256 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-24 15:03:32.709240", "end": "2024-09-24 15:03:32.717110", "delta": "0:00:00.007870", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, 
"argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 51385 1727204612.72951: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. <<< 51385 1727204612.72956: stdout chunk (state=3): >>><<< 51385 1727204612.72966: stderr chunk (state=3): >>><<< 51385 1727204612.72984: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 12:8f:92:e7:c1:ab brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.9.148/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0\n valid_lft 3549sec preferred_lft 3549sec\n inet6 fe80::108f:92ff:fee7:c1ab/64 scope link \n valid_lft forever preferred_lft forever\n21: rpltstbr: mtu 1500 qdisc noqueue state DOWN group default qlen 1000\n link/ether 4a:d1:a2:43:cd:1d brd ff:ff:ff:ff:ff:ff\n inet 192.0.2.72/31 scope global noprefixroute rpltstbr\n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.8.1 dev eth0 proto dhcp src 10.31.9.148 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.9.148 metric 100 \n192.0.2.72/31 dev rpltstbr proto kernel scope link src 192.0.2.72 metric 425 linkdown \nIP -6 ROUTE\n::1 dev lo proto kernel metric 256 pref medium\nfe80::/64 dev eth0 proto kernel metric 256 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n 
echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-24 15:03:32.709240", "end": "2024-09-24 15:03:32.717110", "delta": "0:00:00.007870", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
51385 1727204612.73036: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204612.4618027-53343-264153716547897/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 51385 1727204612.73043: _low_level_execute_command(): starting 51385 1727204612.73049: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204612.4618027-53343-264153716547897/ > /dev/null 2>&1 && sleep 0' 51385 1727204612.73691: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204612.73694: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204612.73705: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204612.73719: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204612.73762: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204612.73768: stderr chunk (state=3): >>>debug2: match not found <<< 51385 1727204612.73778: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204612.73791: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 51385 1727204612.73798: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 51385 1727204612.73805: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 51385 1727204612.73813: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204612.73822: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204612.73833: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204612.73843: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204612.73846: stderr chunk (state=3): >>>debug2: match found <<< 51385 1727204612.73855: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204612.73931: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204612.73955: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204612.73962: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204612.74063: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204612.75839: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204612.75843: stderr chunk (state=3): >>><<< 51385 1727204612.75852: stdout chunk (state=3): >>><<< 51385 1727204612.75867: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204612.75874: handler run complete 51385 1727204612.75899: Evaluated conditional (False): False 51385 1727204612.75909: attempt loop complete, returning result 51385 1727204612.75912: _execute() done 51385 1727204612.75915: dumping result to json 51385 1727204612.75922: done dumping result, returning 51385 1727204612.75930: done running TaskExecutor() for managed-node1/TASK: Check routes and DNS [0affcd87-79f5-6b1f-5706-000000000b17] 51385 1727204612.75936: sending task result for task 0affcd87-79f5-6b1f-5706-000000000b17 51385 1727204612.76051: done sending task result for task 0affcd87-79f5-6b1f-5706-000000000b17 51385 1727204612.76053: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.007870", "end": "2024-09-24 15:03:32.717110", "rc": 0, "start": "2024-09-24 15:03:32.709240" } STDOUT: IP 1: lo: 
mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host valid_lft forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000 link/ether 12:8f:92:e7:c1:ab brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.9.148/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0 valid_lft 3549sec preferred_lft 3549sec inet6 fe80::108f:92ff:fee7:c1ab/64 scope link valid_lft forever preferred_lft forever 21: rpltstbr: mtu 1500 qdisc noqueue state DOWN group default qlen 1000 link/ether 4a:d1:a2:43:cd:1d brd ff:ff:ff:ff:ff:ff inet 192.0.2.72/31 scope global noprefixroute rpltstbr valid_lft forever preferred_lft forever IP ROUTE default via 10.31.8.1 dev eth0 proto dhcp src 10.31.9.148 metric 100 10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.9.148 metric 100 192.0.2.72/31 dev rpltstbr proto kernel scope link src 192.0.2.72 metric 425 linkdown IP -6 ROUTE ::1 dev lo proto kernel metric 256 pref medium fe80::/64 dev eth0 proto kernel metric 256 pref medium RESOLV # Generated by NetworkManager search us-east-1.aws.redhat.com nameserver 10.29.169.13 nameserver 10.29.170.12 nameserver 10.2.32.1 51385 1727204612.76138: no more pending results, returning what we have 51385 1727204612.76142: results queue empty 51385 1727204612.76143: checking for any_errors_fatal 51385 1727204612.76145: done checking for any_errors_fatal 51385 1727204612.76145: checking for max_fail_percentage 51385 1727204612.76147: done checking for max_fail_percentage 51385 1727204612.76148: checking to see if all hosts have failed and the running result is not ok 51385 1727204612.76149: done checking to see if all hosts have failed 51385 1727204612.76150: getting the remaining hosts for this loop 51385 1727204612.76152: done getting the remaining hosts for this loop 51385 1727204612.76155: getting the next task for host 
managed-node1 51385 1727204612.76165: done getting next task for host managed-node1 51385 1727204612.76168: ^ task is: TASK: Verify DNS and network connectivity 51385 1727204612.76170: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 51385 1727204612.76175: getting variables 51385 1727204612.76177: in VariableManager get_vars() 51385 1727204612.76217: Calling all_inventory to load vars for managed-node1 51385 1727204612.76219: Calling groups_inventory to load vars for managed-node1 51385 1727204612.76221: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204612.76232: Calling all_plugins_play to load vars for managed-node1 51385 1727204612.76234: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204612.76237: Calling groups_plugins_play to load vars for managed-node1 51385 1727204612.78292: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204612.82753: done with get_vars() 51385 1727204612.82789: done getting variables 51385 1727204612.82852: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: 
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Tuesday 24 September 2024 15:03:32 -0400 (0:00:00.423) 0:00:31.232 ***** 51385 1727204612.82889: entering _queue_task() for managed-node1/shell 51385 1727204612.83230: worker is 1 (out of 1 available) 51385 1727204612.83244: exiting _queue_task() for managed-node1/shell 51385 1727204612.83257: done queuing things up, now waiting for results queue to drain 51385 1727204612.83258: waiting for pending results... 51385 1727204612.83558: running TaskExecutor() for managed-node1/TASK: Verify DNS and network connectivity 51385 1727204612.83656: in run() - task 0affcd87-79f5-6b1f-5706-000000000b18 51385 1727204612.83682: variable 'ansible_search_path' from source: unknown 51385 1727204612.83687: variable 'ansible_search_path' from source: unknown 51385 1727204612.83714: calling self._execute() 51385 1727204612.83809: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204612.83813: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204612.83823: variable 'omit' from source: magic vars 51385 1727204612.84192: variable 'ansible_distribution_major_version' from source: facts 51385 1727204612.84205: Evaluated conditional (ansible_distribution_major_version != '6'): True 51385 1727204612.84345: variable 'ansible_facts' from source: unknown 51385 1727204612.85493: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): True 51385 1727204612.85498: variable 'omit' from source: magic vars 51385 1727204612.85536: variable 'omit' from source: magic vars 51385 1727204612.85586: variable 'omit' from source: magic vars 51385 1727204612.85624: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 51385 1727204612.85668: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 51385 
1727204612.85688: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 51385 1727204612.85705: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 51385 1727204612.85716: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 51385 1727204612.85745: variable 'inventory_hostname' from source: host vars for 'managed-node1' 51385 1727204612.85749: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204612.85751: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204612.85851: Set connection var ansible_pipelining to False 51385 1727204612.85854: Set connection var ansible_shell_type to sh 51385 1727204612.85866: Set connection var ansible_module_compression to ZIP_DEFLATED 51385 1727204612.85878: Set connection var ansible_timeout to 10 51385 1727204612.85881: Set connection var ansible_connection to ssh 51385 1727204612.85885: Set connection var ansible_shell_executable to /bin/sh 51385 1727204612.85907: variable 'ansible_shell_executable' from source: unknown 51385 1727204612.85911: variable 'ansible_connection' from source: unknown 51385 1727204612.85914: variable 'ansible_module_compression' from source: unknown 51385 1727204612.85916: variable 'ansible_shell_type' from source: unknown 51385 1727204612.85919: variable 'ansible_shell_executable' from source: unknown 51385 1727204612.85921: variable 'ansible_host' from source: host vars for 'managed-node1' 51385 1727204612.85923: variable 'ansible_pipelining' from source: unknown 51385 1727204612.85925: variable 'ansible_timeout' from source: unknown 51385 1727204612.85930: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 51385 1727204612.86071: Loading ActionModule 'shell' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 51385 1727204612.86077: variable 'omit' from source: magic vars 51385 1727204612.86082: starting attempt loop 51385 1727204612.86085: running the handler 51385 1727204612.86102: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 51385 1727204612.86122: _low_level_execute_command(): starting 51385 1727204612.86130: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 51385 1727204612.86878: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204612.86891: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204612.86903: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204612.86918: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204612.86956: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204612.86967: stderr chunk (state=3): >>>debug2: match not found <<< 51385 1727204612.86984: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204612.86997: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 51385 1727204612.87006: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 51385 1727204612.87012: stderr chunk (state=3): >>>debug1: re-parsing 
configuration <<< 51385 1727204612.87020: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204612.87030: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204612.87041: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204612.87048: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204612.87055: stderr chunk (state=3): >>>debug2: match found <<< 51385 1727204612.87066: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204612.87143: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204612.87166: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204612.87175: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204612.87319: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204612.88842: stdout chunk (state=3): >>>/root <<< 51385 1727204612.89019: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204612.89022: stdout chunk (state=3): >>><<< 51385 1727204612.89031: stderr chunk (state=3): >>><<< 51385 1727204612.89065: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204612.89076: _low_level_execute_command(): starting 51385 1727204612.89084: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204612.8906155-53366-230533374003806 `" && echo ansible-tmp-1727204612.8906155-53366-230533374003806="` echo /root/.ansible/tmp/ansible-tmp-1727204612.8906155-53366-230533374003806 `" ) && sleep 0' 51385 1727204612.90753: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204612.90757: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204612.90901: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 51385 1727204612.90904: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204612.90984: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204612.90990: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found <<< 51385 1727204612.91111: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204612.91940: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204612.92279: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204612.92297: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204612.92390: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204612.94222: stdout chunk (state=3): >>>ansible-tmp-1727204612.8906155-53366-230533374003806=/root/.ansible/tmp/ansible-tmp-1727204612.8906155-53366-230533374003806 <<< 51385 1727204612.94425: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204612.94428: stdout chunk (state=3): >>><<< 51385 1727204612.94430: stderr chunk (state=3): >>><<< 51385 1727204612.94762: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204612.8906155-53366-230533374003806=/root/.ansible/tmp/ansible-tmp-1727204612.8906155-53366-230533374003806 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204612.94768: variable 'ansible_module_compression' from source: unknown 51385 1727204612.94771: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-51385tpxlmlox/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 51385 1727204612.94773: variable 'ansible_facts' from source: unknown 51385 1727204612.94775: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204612.8906155-53366-230533374003806/AnsiballZ_command.py 51385 1727204612.95170: Sending initial data 51385 1727204612.95173: Sent initial data (156 bytes) 51385 1727204612.97365: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204612.97414: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204612.97433: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204612.97453: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204612.97507: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204612.97575: stderr chunk (state=3): >>>debug2: match not found <<< 51385 1727204612.97590: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204612.97630: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 51385 1727204612.97644: stderr chunk (state=3): >>>debug2: resolve_canonicalize: 
hostname 10.31.9.148 is address <<< 51385 1727204612.97656: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 51385 1727204612.97675: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204612.97689: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204612.97744: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204612.97756: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204612.97773: stderr chunk (state=3): >>>debug2: match found <<< 51385 1727204612.97789: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204612.97986: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204612.98011: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204612.98028: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204612.98123: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204612.99870: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 51385 1727204612.99896: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle 
limit 1019; using 64 <<< 51385 1727204613.00005: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-51385tpxlmlox/tmp7x2nbykk /root/.ansible/tmp/ansible-tmp-1727204612.8906155-53366-230533374003806/AnsiballZ_command.py <<< 51385 1727204613.00010: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 51385 1727204613.01484: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204613.01569: stderr chunk (state=3): >>><<< 51385 1727204613.01573: stdout chunk (state=3): >>><<< 51385 1727204613.01593: done transferring module to remote 51385 1727204613.01604: _low_level_execute_command(): starting 51385 1727204613.01611: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204612.8906155-53366-230533374003806/ /root/.ansible/tmp/ansible-tmp-1727204612.8906155-53366-230533374003806/AnsiballZ_command.py && sleep 0' 51385 1727204613.03186: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204613.03200: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204613.03215: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204613.03229: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204613.03273: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204613.03280: stderr chunk (state=3): >>>debug2: match not found <<< 51385 1727204613.03290: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204613.03306: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 51385 1727204613.03318: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 51385 1727204613.03325: stderr chunk 
(state=3): >>>debug1: re-parsing configuration <<< 51385 1727204613.03334: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204613.03348: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204613.03361: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204613.03367: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204613.03375: stderr chunk (state=3): >>>debug2: match found <<< 51385 1727204613.03385: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204613.03466: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204613.03483: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204613.03486: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204613.03645: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204613.05402: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204613.05406: stdout chunk (state=3): >>><<< 51385 1727204613.05415: stderr chunk (state=3): >>><<< 51385 1727204613.05437: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204613.05440: _low_level_execute_command(): starting 51385 1727204613.05445: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204612.8906155-53366-230533374003806/AnsiballZ_command.py && sleep 0' 51385 1727204613.07231: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204613.07235: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204613.07286: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204613.07290: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204613.07306: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found <<< 51385 1727204613.07312: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204613.07393: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204613.07547: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204613.07553: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204613.07650: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204613.33831: stdout chunk (state=3): >>> {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 3630 0 --:--:-- --:--:-- 
--:--:-- 3674\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 12125 0 --:--:-- --:--:-- --:--:-- 12125", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-24 15:03:33.205594", "end": "2024-09-24 15:03:33.337555", "delta": "0:00:00.131961", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 51385 1727204613.35039: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
<<< 51385 1727204613.35122: stderr chunk (state=3): >>><<< 51385 1727204613.35152: stdout chunk (state=3): >>><<< 51385 1727204613.35311: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 3630 0 --:--:-- --:--:-- --:--:-- 3674\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 12125 0 --:--:-- --:--:-- --:--:-- 12125", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org 
mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-24 15:03:33.205594", "end": "2024-09-24 15:03:33.337555", "delta": "0:00:00.131961", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session 
id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 51385 1727204613.35315: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts "$host"; then\n echo FAILED to lookup host "$host"\n exit 1\n fi\n if ! curl -o /dev/null https://"$host"; then\n echo FAILED to contact host "$host"\n exit 1\n fi\ndone\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204612.8906155-53366-230533374003806/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 51385 1727204613.35318: _low_level_execute_command(): starting 51385 1727204613.35320: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204612.8906155-53366-230533374003806/ > /dev/null 2>&1 && sleep 0' 51385 1727204613.35980: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51385 1727204613.35995: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204613.36010: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204613.36028: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204613.36076: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 
originally 10.31.9.148 <<< 51385 1727204613.36089: stderr chunk (state=3): >>>debug2: match not found <<< 51385 1727204613.36103: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204613.36120: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 51385 1727204613.36130: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 51385 1727204613.36140: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 51385 1727204613.36151: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51385 1727204613.36163: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51385 1727204613.36185: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51385 1727204613.36196: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 51385 1727204613.36206: stderr chunk (state=3): >>>debug2: match found <<< 51385 1727204613.36217: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51385 1727204613.36298: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 51385 1727204613.36320: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51385 1727204613.36335: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51385 1727204613.36420: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51385 1727204613.38592: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51385 1727204613.38695: stderr chunk (state=3): >>><<< 51385 1727204613.38709: stdout chunk (state=3): >>><<< 51385 1727204613.39141: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51385 1727204613.39145: handler run complete 51385 1727204613.39147: Evaluated conditional (False): False 51385 1727204613.39150: attempt loop complete, returning result 51385 1727204613.39152: _execute() done 51385 1727204613.39154: dumping result to json 51385 1727204613.39156: done dumping result, returning 51385 1727204613.39158: done running TaskExecutor() for managed-node1/TASK: Verify DNS and network connectivity [0affcd87-79f5-6b1f-5706-000000000b18] 51385 1727204613.39162: sending task result for task 0affcd87-79f5-6b1f-5706-000000000b18 51385 1727204613.39375: done sending task result for task 0affcd87-79f5-6b1f-5706-000000000b18 51385 1727204613.39378: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! 
getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "delta": "0:00:00.131961", "end": "2024-09-24 15:03:33.337555", "rc": 0, "start": "2024-09-24 15:03:33.205594" } STDOUT: CHECK DNS AND CONNECTIVITY 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org STDERR: % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 305 100 305 0 0 3630 0 --:--:-- --:--:-- --:--:-- 3674 % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 291 100 291 0 0 12125 0 --:--:-- --:--:-- --:--:-- 12125 51385 1727204613.39451: no more pending results, 
returning what we have 51385 1727204613.39454: results queue empty 51385 1727204613.39456: checking for any_errors_fatal 51385 1727204613.39469: done checking for any_errors_fatal 51385 1727204613.39470: checking for max_fail_percentage 51385 1727204613.39472: done checking for max_fail_percentage 51385 1727204613.39473: checking to see if all hosts have failed and the running result is not ok 51385 1727204613.39474: done checking to see if all hosts have failed 51385 1727204613.39476: getting the remaining hosts for this loop 51385 1727204613.39477: done getting the remaining hosts for this loop 51385 1727204613.39482: getting the next task for host managed-node1 51385 1727204613.39490: done getting next task for host managed-node1 51385 1727204613.39492: ^ task is: TASK: meta (flush_handlers) 51385 1727204613.39493: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51385 1727204613.39498: getting variables 51385 1727204613.39500: in VariableManager get_vars() 51385 1727204613.39544: Calling all_inventory to load vars for managed-node1 51385 1727204613.39547: Calling groups_inventory to load vars for managed-node1 51385 1727204613.39550: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204613.39697: Calling all_plugins_play to load vars for managed-node1 51385 1727204613.39702: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204613.39707: Calling groups_plugins_play to load vars for managed-node1 51385 1727204613.42499: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204613.44405: done with get_vars() 51385 1727204613.44439: done getting variables 51385 1727204613.44522: in VariableManager get_vars() 51385 1727204613.44540: Calling all_inventory to load vars for managed-node1 51385 1727204613.44543: Calling groups_inventory to load vars for managed-node1 51385 1727204613.44545: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204613.44550: Calling all_plugins_play to load vars for managed-node1 51385 1727204613.44552: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204613.44557: Calling groups_plugins_play to load vars for managed-node1 51385 1727204613.46765: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204613.48729: done with get_vars() 51385 1727204613.48762: done queuing things up, now waiting for results queue to drain 51385 1727204613.48766: results queue empty 51385 1727204613.48767: checking for any_errors_fatal 51385 1727204613.48771: done checking for any_errors_fatal 51385 1727204613.48772: checking for max_fail_percentage 51385 1727204613.48773: done checking for max_fail_percentage 51385 1727204613.48774: checking to see if all hosts have failed and the running result is not 
ok 51385 1727204613.48775: done checking to see if all hosts have failed 51385 1727204613.48775: getting the remaining hosts for this loop 51385 1727204613.48776: done getting the remaining hosts for this loop 51385 1727204613.48779: getting the next task for host managed-node1 51385 1727204613.48783: done getting next task for host managed-node1 51385 1727204613.48785: ^ task is: TASK: meta (flush_handlers) 51385 1727204613.48786: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 51385 1727204613.48796: getting variables 51385 1727204613.48797: in VariableManager get_vars() 51385 1727204613.48812: Calling all_inventory to load vars for managed-node1 51385 1727204613.48815: Calling groups_inventory to load vars for managed-node1 51385 1727204613.48817: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204613.48827: Calling all_plugins_play to load vars for managed-node1 51385 1727204613.48830: Calling groups_plugins_inventory to load vars for managed-node1 51385 1727204613.48833: Calling groups_plugins_play to load vars for managed-node1 51385 1727204613.50073: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51385 1727204613.51832: done with get_vars() 51385 1727204613.51866: done getting variables 51385 1727204613.51925: in VariableManager get_vars() 51385 1727204613.51941: Calling all_inventory to load vars for managed-node1 51385 1727204613.51944: Calling groups_inventory to load vars for managed-node1 51385 1727204613.51946: Calling all_plugins_inventory to load vars for managed-node1 51385 1727204613.51951: Calling all_plugins_play to load vars for managed-node1 51385 1727204613.51953: Calling groups_plugins_inventory to load vars for 
managed-node1
51385 1727204613.51956: Calling groups_plugins_play to load vars for managed-node1
51385 1727204613.53320: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
51385 1727204613.55109: done with get_vars()
51385 1727204613.55144: done queuing things up, now waiting for results queue to drain
51385 1727204613.55147: results queue empty
51385 1727204613.55147: checking for any_errors_fatal
51385 1727204613.55149: done checking for any_errors_fatal
51385 1727204613.55150: checking for max_fail_percentage
51385 1727204613.55151: done checking for max_fail_percentage
51385 1727204613.55151: checking to see if all hosts have failed and the running result is not ok
51385 1727204613.55152: done checking to see if all hosts have failed
51385 1727204613.55153: getting the remaining hosts for this loop
51385 1727204613.55154: done getting the remaining hosts for this loop
51385 1727204613.55157: getting the next task for host managed-node1
51385 1727204613.55163: done getting next task for host managed-node1
51385 1727204613.55165: ^ task is: None
51385 1727204613.55167: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
51385 1727204613.55168: done queuing things up, now waiting for results queue to drain
51385 1727204613.55169: results queue empty
51385 1727204613.55170: checking for any_errors_fatal
51385 1727204613.55171: done checking for any_errors_fatal
51385 1727204613.55171: checking for max_fail_percentage
51385 1727204613.55172: done checking for max_fail_percentage
51385 1727204613.55173: checking to see if all hosts have failed and the running result is not ok
51385 1727204613.55174: done checking to see if all hosts have failed
51385 1727204613.55176: getting the next task for host managed-node1
51385 1727204613.55178: done getting next task for host managed-node1
51385 1727204613.55179: ^ task is: None
51385 1727204613.55180: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False

PLAY RECAP *********************************************************************
managed-node1              : ok=79   changed=2    unreachable=0    failed=0    skipped=67   rescued=0    ignored=0

Tuesday 24 September 2024  15:03:33 -0400 (0:00:00.723)       0:00:31.956 *****
===============================================================================
Gathering Facts --------------------------------------------------------- 1.93s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tests_vlan_mtu_nm.yml:6
fedora.linux_system_roles.network : Check which services are running ---- 1.74s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.59s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Install iproute --------------------------------------------------------- 1.44s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16
fedora.linux_system_roles.network : Check which packages are installed --- 1.28s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Create veth interface lsr101 -------------------------------------------- 1.26s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27
Install iproute --------------------------------------------------------- 1.19s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16
Gathering Facts --------------------------------------------------------- 1.02s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_vlan_mtu.yml:3
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.90s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.85s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Check which packages are installed --- 0.84s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.82s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Gather the minimum subset of ansible_facts required by the network role test --- 0.77s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3
Check if system is ostree ----------------------------------------------- 0.77s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17
Verify DNS and network connectivity ------------------------------------- 0.72s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.69s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
fedora.linux_system_roles.network : Re-test connectivity ---------------- 0.54s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
Gather current interface info ------------------------------------------- 0.54s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3
Gather current interface info ------------------------------------------- 0.54s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3
Set up veth as managed by NetworkManager -------------------------------- 0.51s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:35
51385 1727204613.55314: RUNNING CLEANUP
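For reference, a minimal sketch of the settings that typically produce a trace in this shape. This is an assumption from the formatting, not stated in the log itself: the `pid timestamp: message` prefixes match what the `ANSIBLE_DEBUG=1` environment variable emits, and the per-task timing table after PLAY RECAP matches the `profile_tasks` callback from the `ansible.posix` collection.

```ini
; ansible.cfg -- hypothetical configuration reproducing this output style
; (the timing table; the debug prefixes come from ANSIBLE_DEBUG=1 in the
; environment, not from ansible.cfg)
[defaults]
callbacks_enabled = ansible.posix.profile_tasks
```

Equivalently, both can be set per-run via environment variables, e.g. `ANSIBLE_DEBUG=1 ANSIBLE_CALLBACKS_ENABLED=ansible.posix.profile_tasks ansible-playbook ...`.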