[DEPRECATION WARNING]: ANSIBLE_COLLECTIONS_PATHS option, does not fit var naming standard, use the singular form ANSIBLE_COLLECTIONS_PATH instead. This feature will be removed from ansible-core in version 2.19. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg. 24971 1727096411.90984: starting run ansible-playbook [core 2.17.4] config file = None configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules'] ansible python module location = /usr/local/lib/python3.12/site-packages/ansible ansible collection location = /tmp/collections-And executable location = /usr/local/bin/ansible-playbook python version = 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] (/usr/bin/python3.12) jinja version = 3.1.4 libyaml = True No config file found; using defaults 24971 1727096411.92228: Added group all to inventory 24971 1727096411.92231: Added group ungrouped to inventory 24971 1727096411.92235: Group all now contains ungrouped 24971 1727096411.92238: Examining possible inventory source: /tmp/network-EuO/inventory.yml 24971 1727096412.26097: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache 24971 1727096412.26192: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py 24971 1727096412.26216: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory 24971 1727096412.26281: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py 24971 1727096412.26361: Loaded config def from plugin (inventory/script) 24971 1727096412.26363: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py 24971 1727096412.26412: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py 24971 1727096412.26524: Loaded config def from plugin (inventory/yaml) 24971 1727096412.26526: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py 24971 1727096412.26636: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py 24971 1727096412.27125: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py 24971 1727096412.27128: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py) 24971 1727096412.27131: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py) 24971 1727096412.27137: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py) 24971 1727096412.27142: Loading data from /tmp/network-EuO/inventory.yml 24971 1727096412.27221: /tmp/network-EuO/inventory.yml was not parsable by auto 24971 1727096412.27296: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py) 24971 1727096412.27337: Loading data from /tmp/network-EuO/inventory.yml 24971 1727096412.27428: group all already in inventory 24971 1727096412.27435: set inventory_file for managed_node1 24971 1727096412.27439: set inventory_dir for managed_node1 24971 1727096412.27440: Added host managed_node1 to inventory 24971 1727096412.27443: Added host managed_node1 to group all 24971 1727096412.27445: set ansible_host for managed_node1 24971 1727096412.27446: 
set ansible_ssh_extra_args for managed_node1 24971 1727096412.27449: set inventory_file for managed_node2 24971 1727096412.27452: set inventory_dir for managed_node2 24971 1727096412.27453: Added host managed_node2 to inventory 24971 1727096412.27454: Added host managed_node2 to group all 24971 1727096412.27455: set ansible_host for managed_node2 24971 1727096412.27456: set ansible_ssh_extra_args for managed_node2 24971 1727096412.27458: set inventory_file for managed_node3 24971 1727096412.27461: set inventory_dir for managed_node3 24971 1727096412.27461: Added host managed_node3 to inventory 24971 1727096412.27463: Added host managed_node3 to group all 24971 1727096412.27463: set ansible_host for managed_node3 24971 1727096412.27464: set ansible_ssh_extra_args for managed_node3 24971 1727096412.27467: Reconcile groups and hosts in inventory. 24971 1727096412.27474: Group ungrouped now contains managed_node1 24971 1727096412.27476: Group ungrouped now contains managed_node2 24971 1727096412.27478: Group ungrouped now contains managed_node3 24971 1727096412.27572: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name 24971 1727096412.27714: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments 24971 1727096412.27762: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py 24971 1727096412.27794: Loaded config def from plugin (vars/host_group_vars) 24971 1727096412.27797: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True) 24971 1727096412.27804: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars 24971 1727096412.27816: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 24971 1727096412.27860: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False) 24971 1727096412.28247: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096412.28355: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py 24971 1727096412.28405: Loaded config def from plugin (connection/local) 24971 1727096412.28408: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True) 24971 1727096412.29125: Loaded config def from plugin (connection/paramiko_ssh) 24971 1727096412.29128: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True) 24971 1727096412.30084: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 24971 1727096412.30122: Loaded config def from plugin (connection/psrp) 24971 1727096412.30125: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True) 24971 1727096412.30874: Loading ModuleDocFragment 'connection_pipelining' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 24971 1727096412.30918: Loaded config def from plugin (connection/ssh) 24971 1727096412.30921: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True) 24971 1727096412.32899: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 24971 1727096412.32942: Loaded config def from plugin (connection/winrm) 24971 1727096412.32946: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True) 24971 1727096412.32982: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name 24971 1727096412.33050: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py 24971 1727096412.33120: Loaded config def from plugin (shell/cmd) 24971 1727096412.33122: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True) 24971 1727096412.33152: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False) 24971 1727096412.33219: Loaded config def from plugin (shell/powershell) 24971 1727096412.33221: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True) 24971 1727096412.33283: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py 24971 1727096412.33463: Loaded config def from plugin (shell/sh) 24971 1727096412.33474: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True) 24971 1727096412.33507: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name 24971 1727096412.33630: Loaded config def from plugin (become/runas) 24971 1727096412.33633: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True) 24971 1727096412.33852: Loaded config def from plugin (become/su) 24971 1727096412.33854: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True) 24971 1727096412.34013: Loaded config def from plugin (become/sudo) 24971 1727096412.34015: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True) running playbook inside collection fedora.linux_system_roles 24971 1727096412.34046: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tests_ipv6_nm.yml 24971 1727096412.34438: in VariableManager get_vars() 24971 1727096412.34460: done with get_vars() 24971 1727096412.34623: trying /usr/local/lib/python3.12/site-packages/ansible/modules 24971 1727096412.38761: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action 24971 1727096412.38903: in VariableManager get_vars() 24971 
1727096412.38908: done with get_vars() 24971 1727096412.38911: variable 'playbook_dir' from source: magic vars 24971 1727096412.38912: variable 'ansible_playbook_python' from source: magic vars 24971 1727096412.38913: variable 'ansible_config_file' from source: magic vars 24971 1727096412.38913: variable 'groups' from source: magic vars 24971 1727096412.38914: variable 'omit' from source: magic vars 24971 1727096412.38915: variable 'ansible_version' from source: magic vars 24971 1727096412.38916: variable 'ansible_check_mode' from source: magic vars 24971 1727096412.38917: variable 'ansible_diff_mode' from source: magic vars 24971 1727096412.38918: variable 'ansible_forks' from source: magic vars 24971 1727096412.38918: variable 'ansible_inventory_sources' from source: magic vars 24971 1727096412.38926: variable 'ansible_skip_tags' from source: magic vars 24971 1727096412.38933: variable 'ansible_limit' from source: magic vars 24971 1727096412.38934: variable 'ansible_run_tags' from source: magic vars 24971 1727096412.38934: variable 'ansible_verbosity' from source: magic vars 24971 1727096412.38974: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml 24971 1727096412.39541: in VariableManager get_vars() 24971 1727096412.39577: done with get_vars() 24971 1727096412.39616: in VariableManager get_vars() 24971 1727096412.39711: done with get_vars() 24971 1727096412.40026: in VariableManager get_vars() 24971 1727096412.40040: done with get_vars() 24971 1727096412.40044: variable 'omit' from source: magic vars 24971 1727096412.40063: variable 'omit' from source: magic vars 24971 1727096412.40108: in VariableManager get_vars() 24971 1727096412.40119: done with get_vars() 24971 1727096412.40164: in VariableManager get_vars() 24971 1727096412.40182: done with get_vars() 24971 1727096412.40222: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 24971 1727096412.40473: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 24971 1727096412.40603: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 24971 1727096412.41332: in VariableManager get_vars() 24971 1727096412.41350: done with get_vars() 24971 1727096412.41933: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__ 24971 1727096412.42075: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__ redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 24971 1727096412.43659: in VariableManager get_vars() 24971 1727096412.43678: done with get_vars() 24971 1727096412.43725: in VariableManager get_vars() 24971 1727096412.43755: done with get_vars() 24971 1727096412.44452: in VariableManager get_vars() 24971 1727096412.44473: done with get_vars() 24971 1727096412.44479: variable 'omit' from source: magic vars 24971 1727096412.44491: variable 'omit' from source: magic vars 24971 1727096412.44522: in VariableManager get_vars() 24971 1727096412.44536: done with get_vars() 24971 1727096412.44555: in VariableManager get_vars() 24971 1727096412.44575: done with get_vars() 24971 1727096412.44604: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 24971 1727096412.44720: Loading data from 
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 24971 1727096412.44909: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 24971 1727096412.47818: in VariableManager get_vars() 24971 1727096412.47841: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 24971 1727096412.49879: in VariableManager get_vars() 24971 1727096412.49902: done with get_vars() 24971 1727096412.50032: in VariableManager get_vars() 24971 1727096412.50051: done with get_vars() 24971 1727096412.50107: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback 24971 1727096412.50121: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__ redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug 24971 1727096412.50355: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py 24971 1727096412.50515: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug) 24971 1727096412.50518: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-And/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) 24971 1727096412.50548: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name 24971 1727096412.50579: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False) 24971 1727096412.50746: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py 24971 1727096412.50811: Loaded config def from plugin (callback/default) 24971 1727096412.50814: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 24971 1727096412.51957: Loaded config def from plugin (callback/junit) 24971 1727096412.51960: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 24971 1727096412.52009: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False) 24971 1727096412.52075: Loaded config def from plugin (callback/minimal) 24971 1727096412.52077: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 24971 
1727096412.52115: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 24971 1727096412.52177: Loaded config def from plugin (callback/tree) 24971 1727096412.52180: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks 24971 1727096412.52298: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks) 24971 1727096412.52301: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-And/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) Skipping callback 'default', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. PLAYBOOK: tests_ipv6_nm.yml **************************************************** 2 plays in /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tests_ipv6_nm.yml 24971 1727096412.52327: in VariableManager get_vars() 24971 1727096412.52339: done with get_vars() 24971 1727096412.52344: in VariableManager get_vars() 24971 1727096412.52352: done with get_vars() 24971 1727096412.52356: variable 'omit' from source: magic vars 24971 1727096412.52394: in VariableManager get_vars() 24971 1727096412.52408: done with get_vars() 24971 1727096412.52427: variable 'omit' from source: magic vars PLAY [Run playbook 'playbooks/tests_ipv6.yml' with nm as provider] ************* 24971 1727096412.52973: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy 24971 1727096412.53045: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py 24971 1727096412.53082: getting the remaining hosts for this loop 24971 1727096412.53084: done getting the remaining hosts for this loop 24971 1727096412.53087: getting the next task for host managed_node3 24971 1727096412.53090: done getting next task for host managed_node3 24971 1727096412.53092: ^ task is: TASK: Gathering Facts 24971 1727096412.53094: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24971 1727096412.53096: getting variables 24971 1727096412.53097: in VariableManager get_vars() 24971 1727096412.53106: Calling all_inventory to load vars for managed_node3 24971 1727096412.53108: Calling groups_inventory to load vars for managed_node3 24971 1727096412.53111: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096412.53122: Calling all_plugins_play to load vars for managed_node3 24971 1727096412.53133: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096412.53136: Calling groups_plugins_play to load vars for managed_node3 24971 1727096412.53287: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096412.53341: done with get_vars() 24971 1727096412.53347: done getting variables 24971 1727096412.53520: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tests_ipv6_nm.yml:6 Monday 23 September 2024 09:00:12 -0400 (0:00:00.013) 0:00:00.013 ****** 24971 1727096412.53541: entering _queue_task() for managed_node3/gather_facts 24971 1727096412.53542: Creating lock for gather_facts 24971 1727096412.54352: worker is 1 (out of 1 available) 24971 1727096412.54369: exiting _queue_task() for managed_node3/gather_facts 24971 1727096412.54382: done queuing things up, now waiting for results queue to drain 24971 1727096412.54384: waiting for pending results... 
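The playbook driving this run is the nm-provider wrapper named in the PLAY line above. Its contents are not reproduced in this log; the sketch below is only a guess at its shape, based on the play name, the "2 plays" count, and the usual fedora.linux_system_roles test layout (a short play that forces the provider, followed by an import of the shared playbooks/tests_ipv6.yml). Task names and ordering here are assumptions.

    ---
    # Hypothetical reconstruction of tests_ipv6_nm.yml -- not copied from this run.
    - name: Run playbook 'playbooks/tests_ipv6.yml' with nm as provider
      hosts: all
      tasks:
        - name: Set network provider to 'nm'      # assumed task
          set_fact:
            network_provider: nm

    # Second play: the shared IPv6 test, imported unchanged.
    - import_playbook: playbooks/tests_ipv6.yml

The "Gathering Facts" task queued above is the implicit fact-gathering step of the first play; with gather_facts left at its default, Ansible runs the setup module before the play's own tasks, which is exactly what the following entries show.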
24971 1727096412.54583: running TaskExecutor() for managed_node3/TASK: Gathering Facts 24971 1727096412.54747: in run() - task 0afff68d-5257-3482-6844-0000000000b9 24971 1727096412.54761: variable 'ansible_search_path' from source: unknown 24971 1727096412.54799: calling self._execute() 24971 1727096412.54854: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096412.54857: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096412.54867: variable 'omit' from source: magic vars 24971 1727096412.55161: variable 'omit' from source: magic vars 24971 1727096412.55208: variable 'omit' from source: magic vars 24971 1727096412.55225: variable 'omit' from source: magic vars 24971 1727096412.55283: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24971 1727096412.55510: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24971 1727096412.55534: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24971 1727096412.55573: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096412.55577: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096412.55586: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24971 1727096412.55589: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096412.55594: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096412.55937: Set connection var ansible_shell_type to sh 24971 1727096412.55941: Set connection var ansible_shell_executable to /bin/sh 24971 1727096412.56041: Set connection var ansible_timeout to 10 24971 1727096412.56045: Set connection var ansible_connection to ssh 24971 1727096412.56048: Set connection var ansible_pipelining to False 24971 1727096412.56050: Set connection var ansible_module_compression to ZIP_DEFLATED 24971 1727096412.56052: variable 'ansible_shell_executable' from source: unknown 24971 1727096412.56054: variable 'ansible_connection' from source: unknown 24971 1727096412.56056: variable 'ansible_module_compression' from source: unknown 24971 1727096412.56058: variable 'ansible_shell_type' from source: unknown 24971 1727096412.56060: variable 'ansible_shell_executable' from source: unknown 24971 1727096412.56063: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096412.56065: variable 'ansible_pipelining' from source: unknown 24971 1727096412.56138: variable 'ansible_timeout' from source: unknown 24971 1727096412.56143: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096412.56780: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24971 1727096412.56976: variable 'omit' from source: magic vars 24971 1727096412.56980: starting attempt loop 24971 1727096412.56982: running the handler 24971 1727096412.56985: variable 'ansible_facts' from source: unknown 24971 1727096412.56989: _low_level_execute_command(): starting 24971 1727096412.56992: 
_low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24971 1727096412.58997: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096412.59002: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096412.59012: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096412.59073: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096412.59207: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096412.60880: stdout chunk (state=3): >>>/root <<< 24971 1727096412.61176: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096412.61180: stdout chunk (state=3): >>><<< 24971 1727096412.61183: stderr chunk (state=3): >>><<< 24971 1727096412.61186: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096412.61188: _low_level_execute_command(): starting 24971 1727096412.61192: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096412.6109333-24991-220611245613430 `" && echo ansible-tmp-1727096412.6109333-24991-220611245613430="` echo /root/.ansible/tmp/ansible-tmp-1727096412.6109333-24991-220611245613430 `" ) && sleep 0' 24971 1727096412.62677: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config <<< 24971 1727096412.62681: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 24971 1727096412.62729: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096412.62748: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24971 1727096412.62788: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096412.62945: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096412.63060: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096412.63111: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096412.64989: stdout chunk (state=3): >>>ansible-tmp-1727096412.6109333-24991-220611245613430=/root/.ansible/tmp/ansible-tmp-1727096412.6109333-24991-220611245613430 <<< 24971 1727096412.65125: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096412.65154: stderr chunk (state=3): >>><<< 24971 1727096412.65178: stdout chunk (state=3): >>><<< 24971 1727096412.65236: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096412.6109333-24991-220611245613430=/root/.ansible/tmp/ansible-tmp-1727096412.6109333-24991-220611245613430 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096412.65441: variable 'ansible_module_compression' from source: unknown 24971 1727096412.65446: ANSIBALLZ: Using generic lock for ansible.legacy.setup 24971 1727096412.65449: ANSIBALLZ: Acquiring lock 24971 1727096412.65451: ANSIBALLZ: Lock acquired: 139839577444416 24971 1727096412.65550: ANSIBALLZ: Creating module 
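A few entries earlier the connection was set up with ansible_pipelining False, so the task follows the full temporary-file workflow recorded in the surrounding entries: probe the remote home directory with 'echo ~', create a per-task directory under ~/.ansible/tmp, then (in the entries that follow) pack the setup module with ANSIBALLZ, copy it over SFTP, make it executable, and run it. Enabling pipelining lets most modules be streamed to the remote Python over the existing SSH session instead, skipping the mkdir/transfer/chmod round trip. The snippet below is an illustrative group_vars setting using the standard ansible_pipelining variable; it is not something this test run configures.

    # Hypothetical group_vars/all.yml -- optional tuning, not part of this run.
    ansible_pipelining: true    # stream modules over SSH instead of writing a temp file first
    # Note: pipelining generally requires that sudo on the target does not enforce "requiretty".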
24971 1727096413.08888: ANSIBALLZ: Writing module into payload 24971 1727096413.09049: ANSIBALLZ: Writing module 24971 1727096413.09085: ANSIBALLZ: Renaming module 24971 1727096413.09097: ANSIBALLZ: Done creating module 24971 1727096413.09139: variable 'ansible_facts' from source: unknown 24971 1727096413.09151: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24971 1727096413.09164: _low_level_execute_command(): starting 24971 1727096413.09193: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0' 24971 1727096413.10308: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096413.10407: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096413.10429: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096413.10500: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096413.12197: stdout chunk (state=3): >>>PLATFORM <<< 24971 1727096413.12266: stdout chunk (state=3): >>>Linux <<< 24971 1727096413.12301: stdout chunk (state=3): >>>FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND <<< 24971 1727096413.12471: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096413.12496: stdout chunk (state=3): >>><<< 24971 1727096413.12499: stderr chunk (state=3): >>><<< 24971 1727096413.12647: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096413.12653 [managed_node3]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3', '/usr/bin/python3'] 24971 1727096413.12657: _low_level_execute_command(): starting 24971 1727096413.12659: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && sleep 0' 24971 1727096413.12800: Sending initial data 24971 1727096413.12803: Sent initial data (1181 bytes) 24971 1727096413.13498: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 24971 1727096413.13589: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096413.13626: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096413.13784: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096413.17174: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} <<< 24971 1727096413.17586: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096413.17672: stderr chunk (state=3): >>><<< 24971 1727096413.17676: stdout chunk (state=3): >>><<< 24971 1727096413.17678: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 
(Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096413.17762: variable 'ansible_facts' from source: unknown 24971 1727096413.17796: variable 'ansible_facts' from source: unknown 24971 1727096413.17799: variable 'ansible_module_compression' from source: unknown 24971 1727096413.17842: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24971ctsu8jan/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 24971 1727096413.17905: variable 'ansible_facts' from source: unknown 24971 1727096413.18101: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096412.6109333-24991-220611245613430/AnsiballZ_setup.py 24971 1727096413.18374: Sending initial data 24971 1727096413.18377: Sent initial data (154 bytes) 24971 1727096413.18876: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24971 1727096413.18880: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096413.18885: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096413.19019: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096413.19022: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096413.19025: stderr chunk 
(state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096413.19073: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096413.20685: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 24971 1727096413.20700: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 24971 1727096413.20710: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 24971 1727096413.20726: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24971 1727096413.20788: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 24971 1727096413.20869: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24971ctsu8jan/tmp524if3kk /root/.ansible/tmp/ansible-tmp-1727096412.6109333-24991-220611245613430/AnsiballZ_setup.py <<< 24971 1727096413.20873: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096412.6109333-24991-220611245613430/AnsiballZ_setup.py" <<< 24971 1727096413.20896: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24971ctsu8jan/tmp524if3kk" to remote "/root/.ansible/tmp/ansible-tmp-1727096412.6109333-24991-220611245613430/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096412.6109333-24991-220611245613430/AnsiballZ_setup.py" <<< 24971 1727096413.22419: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096413.22459: stderr chunk (state=3): >>><<< 24971 1727096413.22463: stdout chunk (state=3): >>><<< 24971 1727096413.22481: done transferring module to remote 24971 1727096413.22500: _low_level_execute_command(): starting 24971 1727096413.22576: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096412.6109333-24991-220611245613430/ /root/.ansible/tmp/ansible-tmp-1727096412.6109333-24991-220611245613430/AnsiballZ_setup.py && sleep 0' 24971 1727096413.23488: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24971 1727096413.23552: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096413.23620: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096413.23637: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096413.23678: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096413.23905: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096413.25806: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096413.25810: stdout chunk (state=3): >>><<< 24971 1727096413.25815: stderr chunk (state=3): >>><<< 24971 1727096413.25920: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096413.25923: _low_level_execute_command(): starting 24971 1727096413.25925: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096412.6109333-24991-220611245613430/AnsiballZ_setup.py && sleep 0' 24971 1727096413.27233: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24971 1727096413.27696: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096413.27717: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 
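The wall of "import ..." stdout chunks that follows is not an error: the module was launched with PYTHONVERBOSE=1 (visible in the command above), so the remote interpreter prints every import as AnsiballZ_setup.py starts up. That interpreter, /usr/bin/python3.12, was chosen by the discovery probe shown earlier (the PLATFORM/FOUND/ENDFOUND command plus the os-release check). If discovery is unwanted, the interpreter can be pinned per host; the excerpt below uses the standard ansible_python_interpreter variable and is an assumption for illustration, not taken from this run's inventory.

    # Hypothetical inventory excerpt -- pinning the interpreter skips the
    # PLATFORM/FOUND discovery probe on this host.
    all:
      hosts:
        managed_node3:
          ansible_python_interpreter: /usr/bin/python3.12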
24971 1727096413.27788: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096413.30017: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 24971 1727096413.30034: stdout chunk (state=3): >>>import _imp # builtin <<< 24971 1727096413.30088: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 24971 1727096413.30137: stdout chunk (state=3): >>>import '_io' # <<< 24971 1727096413.30148: stdout chunk (state=3): >>>import 'marshal' # <<< 24971 1727096413.30203: stdout chunk (state=3): >>>import 'posix' # <<< 24971 1727096413.30217: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 24971 1727096413.30247: stdout chunk (state=3): >>>import 'time' # <<< 24971 1727096413.30251: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<< 24971 1727096413.30422: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 24971 1727096413.30434: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<< 24971 1727096413.30441: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14f104d0> <<< 24971 1727096413.30445: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14edfb30> <<< 24971 1727096413.30523: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14f12a50> import '_signal' # import '_abc' # import 'abc' # <<< 24971 1727096413.30541: stdout chunk (state=3): >>>import 'io' # <<< 24971 1727096413.30575: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 24971 1727096413.30662: stdout chunk (state=3): >>>import '_collections_abc' # <<< 24971 1727096413.30690: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 24971 1727096413.30720: stdout chunk (state=3): >>>import 'os' # <<< 24971 1727096413.30746: stdout chunk (state=3): >>>import '_sitebuiltins' # Processing user site-packages <<< 24971 1727096413.30851: stdout chunk (state=3): >>>Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' <<< 24971 1727096413.30854: stdout chunk (state=3): >>>Adding directory: '/usr/lib/python3.12/site-packages' <<< 24971 1727096413.30989: stdout chunk (state=3): >>>Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14ce5130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14ce5fa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 24971 1727096413.31362: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 24971 1727096413.31391: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 24971 1727096413.31418: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 24971 1727096413.31482: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 24971 1727096413.31516: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 24971 1727096413.31523: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14d23dd0> <<< 24971 1727096413.31541: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 24971 1727096413.31555: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 24971 1727096413.31582: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14d23fe0> <<< 24971 1727096413.31617: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 24971 1727096413.31632: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 24971 1727096413.31684: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 24971 1727096413.31725: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 24971 1727096413.31760: stdout chunk (state=3): >>>import 'itertools' # <<< 24971 1727096413.31826: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' <<< 24971 1727096413.31830: stdout chunk (state=3): >>>import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14d5b7a0> <<< 24971 1727096413.31832: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<< 24971 1727096413.31834: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' <<< 24971 1727096413.31836: stdout chunk (state=3): >>>import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f9c14d5be30> <<< 24971 1727096413.31876: stdout chunk (state=3): >>>import '_collections' # <<< 24971 1727096413.31879: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14d3baa0> <<< 24971 1727096413.31881: stdout chunk (state=3): >>>import '_functools' # <<< 24971 1727096413.31959: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14d391c0> <<< 24971 1727096413.31997: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14d20f80> <<< 24971 1727096413.32059: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # <<< 24971 1727096413.32083: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 24971 1727096413.32158: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 24971 1727096413.32192: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14d7b710> <<< 24971 1727096413.32198: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14d7a330> <<< 24971 1727096413.32214: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14d3a090> <<< 24971 1727096413.32294: stdout chunk (state=3): >>>import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14d78b90> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14db0740> <<< 24971 1727096413.32302: stdout chunk (state=3): >>>import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14d20200> <<< 24971 1727096413.32421: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c14db0bf0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14db0aa0> <<< 24971 1727096413.32425: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 24971 1727096413.32427: stdout chunk (state=3): >>># extension module 'binascii' executed from 
'/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c14db0e90> <<< 24971 1727096413.32429: stdout chunk (state=3): >>>import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14d1ed20> <<< 24971 1727096413.32459: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py <<< 24971 1727096413.32464: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 24971 1727096413.32642: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14db1580> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14db1250> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14db2480> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 24971 1727096413.32678: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 24971 1727096413.32802: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py <<< 24971 1727096413.32807: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14dc8680> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c14dc9d60> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py <<< 24971 1727096413.32813: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 24971 1727096413.32833: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 24971 1727096413.32843: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14dcac00> <<< 24971 1727096413.32892: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c14dcb260> <<< 24971 1727096413.32898: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14dca150> <<< 24971 1727096413.32944: stdout chunk 
(state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 24971 1727096413.33296: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c14dcbce0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14dcb410> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14db24b0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c14abfbc0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c14ae86e0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14ae8440> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c14ae8710> <<< 24971 1727096413.33299: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 24971 1727096413.33338: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 24971 1727096413.33517: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c14ae9040> <<< 24971 1727096413.33618: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' <<< 24971 1727096413.33623: stdout chunk (state=3): >>># extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c14ae9a30> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14ae88f0> <<< 24971 
1727096413.33641: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14abdd60> <<< 24971 1727096413.33651: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 24971 1727096413.33740: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14aeade0> <<< 24971 1727096413.33762: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14ae9b50> <<< 24971 1727096413.33765: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14db2ba0> <<< 24971 1727096413.33791: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 24971 1727096413.33934: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 24971 1727096413.33950: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14b17140> <<< 24971 1727096413.34053: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 24971 1727096413.34056: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 24971 1727096413.34086: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14b37500> <<< 24971 1727096413.34161: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 24971 1727096413.34213: stdout chunk (state=3): >>>import 'ntpath' # <<< 24971 1727096413.34261: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' <<< 24971 1727096413.34316: stdout chunk (state=3): >>>import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14b982c0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 24971 1727096413.34395: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 24971 1727096413.34482: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f9c14b9aa20> <<< 24971 1727096413.34803: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14b983e0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14b612e0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c149a53d0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14b36300> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14aebd10> <<< 24971 1727096413.34807: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 24971 1727096413.34809: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f9c14b36900> <<< 24971 1727096413.35099: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_48675wkb/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available <<< 24971 1727096413.35222: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096413.35254: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 24971 1727096413.35270: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 24971 1727096413.35306: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 24971 1727096413.35385: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 24971 1727096413.35421: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14a0b0b0> <<< 24971 1727096413.35427: stdout chunk (state=3): >>>import '_typing' # <<< 24971 1727096413.35700: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c149e9fa0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c149e9130> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available <<< 24971 1727096413.35704: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096413.35706: stdout chunk (state=3): >>>import 'ansible.module_utils' # <<< 24971 1727096413.35772: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096413.37141: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096413.38307: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14a08f80> <<< 24971 1727096413.38311: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/json/__init__.py <<< 24971 1727096413.38595: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c14a3a960> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14a3a6f0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14a3a000> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14a3aa50> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14a0bd40> import 'atexit' # <<< 24971 1727096413.38599: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c14a3b6b0> <<< 24971 1727096413.38643: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c14a3b8f0> <<< 24971 1727096413.38650: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 24971 1727096413.38703: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 24971 1727096413.38706: stdout chunk (state=3): >>>import '_locale' # <<< 24971 1727096413.38761: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14a3be30> <<< 24971 1727096413.38769: stdout chunk (state=3): >>>import 'pwd' # <<< 24971 1727096413.38790: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 24971 1727096413.38817: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 24971 1727096413.38857: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c1432dbe0> <<< 24971 1727096413.38918: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c1432f800> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 24971 1727096413.38925: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 24971 1727096413.38966: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c143301d0> <<< 24971 1727096413.39004: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 24971 1727096413.39024: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 24971 1727096413.39130: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14331370> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 24971 1727096413.39177: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14333e30> <<< 24971 1727096413.39212: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c14a3bef0> <<< 24971 1727096413.39242: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c143320f0> <<< 24971 1727096413.39252: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 24971 1727096413.39297: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 24971 1727096413.39304: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 24971 1727096413.39329: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 24971 1727096413.39486: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 24971 1727096413.39591: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c1433bce0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c1433a7b0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c1433a510> <<< 24971 1727096413.39594: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py <<< 24971 1727096413.39597: stdout chunk (state=3): >>># code object 
from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 24971 1727096413.39679: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c1433aa80> <<< 24971 1727096413.39724: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14332600> <<< 24971 1727096413.39731: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' <<< 24971 1727096413.39760: stdout chunk (state=3): >>># extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c1437ffe0> <<< 24971 1727096413.39786: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14380110> <<< 24971 1727096413.39806: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 24971 1727096413.39895: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c14381bb0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14381970> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 24971 1727096413.39976: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 24971 1727096413.39982: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c14384110> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c143822a0> <<< 24971 1727096413.40011: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 24971 1727096413.40051: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 24971 1727096413.40083: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 24971 1727096413.40131: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' <<< 24971 1727096413.40136: stdout chunk (state=3): >>>import '_string' # <<< 24971 1727096413.40223: 
stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14387860> <<< 24971 1727096413.40318: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14384230> <<< 24971 1727096413.40356: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' <<< 24971 1727096413.40362: stdout chunk (state=3): >>># extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c143886b0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c14388890> <<< 24971 1727096413.40436: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' <<< 24971 1727096413.40446: stdout chunk (state=3): >>># extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c14388a10> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14380320> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 24971 1727096413.40452: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 24971 1727096413.40550: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c142140e0> <<< 24971 1727096413.40709: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c142154c0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c1438a870> <<< 24971 1727096413.40723: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' <<< 24971 1727096413.40807: stdout chunk (state=3): >>># extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c1438bc20> import 'systemd.daemon' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f9c1438a510> # zipimport: zlib available <<< 24971 1727096413.40810: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 24971 1727096413.40857: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096413.40965: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # <<< 24971 1727096413.40995: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096413.40998: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.text' # <<< 24971 1727096413.41020: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096413.41244: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 24971 1727096413.42021: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096413.42385: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 24971 1727096413.42426: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c14219610> <<< 24971 1727096413.42502: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 24971 1727096413.42524: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c1421a480> <<< 24971 1727096413.42530: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14ae99d0> <<< 24971 1727096413.42582: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 24971 1727096413.42588: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096413.42713: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available <<< 24971 1727096413.42780: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096413.42933: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 24971 1727096413.42943: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c1421a3f0> <<< 24971 1727096413.43064: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096413.43404: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096413.43852: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096413.43934: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096413.44011: stdout chunk (state=3): >>>import 
'ansible.module_utils.common.collections' # <<< 24971 1727096413.44015: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096413.44031: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096413.44063: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 24971 1727096413.44066: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096413.44228: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096413.44268: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available <<< 24971 1727096413.44379: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available <<< 24971 1727096413.44570: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096413.44799: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 24971 1727096413.44864: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 24971 1727096413.44979: stdout chunk (state=3): >>>import '_ast' # <<< 24971 1727096413.44989: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c1421b470> # zipimport: zlib available <<< 24971 1727096413.45021: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096413.45135: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # <<< 24971 1727096413.45152: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096413.45307: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.locale' # <<< 24971 1727096413.45319: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096413.45362: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096413.45412: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096413.45533: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096413.45630: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 24971 1727096413.45683: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 24971 1727096413.45800: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c14225f10> <<< 24971 1727096413.45803: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14223c80> <<< 24971 1727096413.46083: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 24971 1727096413.46251: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: 
zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c1430e900> <<< 24971 1727096413.46275: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c143fe5d0> <<< 24971 1727096413.46355: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14226030> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c142156d0> # destroy ansible.module_utils.distro <<< 24971 1727096413.46383: stdout chunk (state=3): >>>import 'ansible.module_utils.distro' # # zipimport: zlib available <<< 24971 1727096413.46390: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096413.46418: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # <<< 24971 1727096413.46424: stdout chunk (state=3): >>>import 'ansible.module_utils.common.sys_info' # <<< 24971 1727096413.46705: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 24971 1727096413.46747: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096413.46754: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096413.46841: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available <<< 24971 1727096413.46904: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096413.46987: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096413.46995: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096413.47042: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # # zipimport: zlib available <<< 24971 1727096413.47339: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096413.47387: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096413.47420: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096413.47487: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 24971 1727096413.47510: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code 
object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' <<< 24971 1727096413.47587: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c142ba030> <<< 24971 1727096413.47599: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py <<< 24971 1727096413.47617: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' <<< 24971 1727096413.47623: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 24971 1727096413.47679: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 24971 1727096413.47689: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py <<< 24971 1727096413.47711: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c13ee3f20> <<< 24971 1727096413.47755: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 24971 1727096413.47759: stdout chunk (state=3): >>># extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 24971 1727096413.47770: stdout chunk (state=3): >>>import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c13ee8290> <<< 24971 1727096413.48084: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c142a2e70> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c142baba0> <<< 24971 1727096413.48116: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c142b86e0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c142b8b00> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c13eeb1d0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c13eeaa80> # extension module '_queue' loaded from 
'/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c13eeac60> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c13ee9eb0> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 24971 1727096413.48287: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c13eeb260> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' <<< 24971 1727096413.48292: stdout chunk (state=3): >>># extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c13f51d60> <<< 24971 1727096413.48315: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c13eebd70> <<< 24971 1727096413.48343: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c142b83b0> <<< 24971 1727096413.48357: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.timeout' # <<< 24971 1727096413.48387: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available <<< 24971 1727096413.48397: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other' # <<< 24971 1727096413.48581: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096413.48614: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available <<< 24971 1727096413.48649: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available <<< 24971 1727096413.48660: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system' # <<< 24971 1727096413.48688: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096413.48748: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available <<< 24971 1727096413.48792: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096413.48825: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # <<< 24971 1727096413.48833: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096413.48974: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available <<< 24971 1727096413.49018: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096413.49039: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096413.49124: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 24971 1727096413.49155: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # <<< 24971 1727096413.49164: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096413.49885: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096413.50100: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # <<< 24971 1727096413.50103: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096413.50125: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096413.50181: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096413.50213: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096413.50279: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # <<< 24971 1727096413.50283: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.date_time' # <<< 24971 1727096413.50286: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096413.50288: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096413.50320: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # <<< 24971 1727096413.50327: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096413.50470: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # <<< 24971 1727096413.50474: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096413.50477: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096413.50542: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available <<< 24971 1727096413.50548: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096413.50574: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # <<< 24971 1727096413.50581: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096413.50682: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096413.50750: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py <<< 24971 1727096413.50894: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c13f534d0> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 24971 1727096413.50972: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c13f52690> <<< 24971 1727096413.50975: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.local' # <<< 24971 1727096413.50978: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096413.51182: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available <<< 24971 1727096413.51185: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096413.51504: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # 
/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 24971 1727096413.51584: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 24971 1727096413.51611: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 24971 1727096413.51682: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c13f89f70> <<< 24971 1727096413.52072: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c13f79cd0> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available <<< 24971 1727096413.52076: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096413.52078: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available <<< 24971 1727096413.52080: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096413.52401: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 24971 1727096413.52405: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # <<< 24971 1727096413.52408: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available <<< 24971 1727096413.52652: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c13f9dac0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c13f9d7f0> <<< 24971 1727096413.52674: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available <<< 24971 1727096413.52682: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096413.52689: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware' # <<< 24971 1727096413.52695: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096413.52733: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096413.52999: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available <<< 24971 1727096413.53017: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096413.53085: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available <<< 24971 1727096413.53302: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 24971 1727096413.53325: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096413.53374: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # <<< 24971 1727096413.53378: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.darwin' 
# <<< 24971 1727096413.53394: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096413.53413: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096413.53475: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096413.53628: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096413.53783: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available <<< 24971 1727096413.53829: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096413.54063: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 24971 1727096413.54577: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096413.55074: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # <<< 24971 1727096413.55172: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hurd' # <<< 24971 1727096413.55176: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096413.55195: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096413.55290: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 24971 1727096413.55297: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096413.55584: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available <<< 24971 1727096413.55644: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096413.56083: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available <<< 24971 1727096413.56118: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096413.56313: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096413.56594: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # <<< 24971 1727096413.56603: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096413.56625: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096413.56652: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # <<< 24971 1727096413.56660: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096413.56728: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096413.57115: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available <<< 24971 1727096413.57118: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096413.57120: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available <<< 24971 
1727096413.57716: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # <<< 24971 1727096413.57727: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096413.57730: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096413.57733: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # <<< 24971 1727096413.57735: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096413.57983: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available <<< 24971 1727096413.58033: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096413.58116: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 24971 1727096413.58148: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # <<< 24971 1727096413.58194: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 24971 1727096413.58243: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # <<< 24971 1727096413.58263: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096413.58287: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 24971 1727096413.58331: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096413.58384: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096413.58462: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096413.58513: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # <<< 24971 1727096413.58534: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available <<< 24971 1727096413.58586: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096413.58639: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available <<< 24971 1727096413.58905: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096413.59024: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 24971 1727096413.59039: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096413.59070: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096413.59123: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available <<< 24971 1727096413.59170: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096413.59221: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # <<< 24971 1727096413.59233: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096413.59341: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096413.59388: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # <<< 24971 1727096413.59451: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096413.59483: stdout chunk (state=3): >>># zipimport: 
zlib available <<< 24971 1727096413.59587: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 24971 1727096413.59669: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096413.60130: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 24971 1727096413.60160: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py <<< 24971 1727096413.60176: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 24971 1727096413.60258: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c13da2570> <<< 24971 1727096413.60261: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c13da1040> <<< 24971 1727096413.60263: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c13ee97f0> <<< 24971 1727096413.71473: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' <<< 24971 1727096413.71478: stdout chunk (state=3): >>>import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c13deaf90> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' <<< 24971 1727096413.71777: stdout chunk (state=3): >>>import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c13de8f50> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c13deb290> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c13de9f40> <<< 24971 1727096413.71781: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 24971 1727096413.95634: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_user_id": "root", 
"ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC7DqKfC0Ye8uNTSswfdzhsJfWNkZqr/RUstyC1z93+5saW22gxwNSvixV/q/ri8GxqpAsSE+oK2wsR0V0GdogZH271MQEZ63vu1adXuVvwMLN81oGLTzCz50BgmDXlvpEVTodRA7qlbviA7mmwUA/1WP1v1UqCiHmBjaiQT14v9PJhSSiWk1S/2gPlOmfl801arqI6YnvGYHiIop+XESUB4W8ewtJtTJhFqzucGnDCUYA1VAqcSIdjoQaZhZeDXc1gRTzx7QIe3EGJnCnomIPoXNvQk2UgnH08TJGZFoICgqdfZUi+g2uK+PweRJ1TUsmlf86iGOSFYgnCserDeAdpOIuLR5oBE8UNKjxPBW6hJ3It+vPT8mrYLOXrlbt7it9HhxfgYWcqc6ebJyBYNMo0bPdddEpIPiWDXZOoQmZGaGOpSa1K+8hxIvZ9t+jl8b8b8sODYeoVjVaeVXt0i5hoW0CoLWO77c+cGgwJ5+kzXD7eo/SVn+kYjDfKFBCtkJU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJ80c6wNB873zUjbHI8VtU557DuKd4APS4WjMTqAOLKKumkxtoQY9gCkk5ZG2HLqdKHBgsFER7nThIQ+11R1mBw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINeHLdc2PHGxhVM3VxIFOOPlFPTEnJXEcPAkPavmyu6v", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-14-152.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-14-152", "ansible_nodename": "ip-10-31-14-152.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec24182c1ede649ec25b32fe78ed72bb", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_pkg_mgr": "dnf", "ansible_iscsi_iqn": "", "ansible_apparmor": {"status": "disabled"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": 
"39", "day": "23", "hour": "09", "minute": "00", "second": "13", "epoch": "1727096413", "epoch_int": "1727096413", "date": "2024-09-23", "time": "09:00:13", "iso8601_micro": "2024-09-23T13:00:13.608272Z", "iso8601": "2024-09-23T13:00:13Z", "iso8601_basic": "20240923T090013608272", "iso8601_basic_short": "20240923T090013", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_lsb": {}, "ansible_local": {}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 <<< 24971 1727096413.95797: stdout chunk (state=3): >>>@ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2987, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 544, "free": 2987}, "nocache": {"free": 3304, "used": 227}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec24182c-1ede-649e-c25b-32fe78ed72bb", "ansible_product_uuid": "ec24182c-1ede-649e-c25b-32fe78ed72bb", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 556, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261803384832, "block_size": 4096, "block_total": 65519099, "block_available": 63916842, "block_used": 1602257, "inode_total": 131070960, "inode_available": 131029181, "inode_used": 41779, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_loadavg": {"1m": 0.3837890625, "5m": 0.49853515625, "15m": 0.31005859375}, "ansible_is_chroot": false, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, 
"type": "cpython"}, "ansible_fips": false, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.159 44844 10.31.14.152 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.159 44844 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:df:7b:8e:75", "mtu": 
9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.14.152", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:dfff:fe7b:8e75", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.14.152", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:df:7b:8e:75", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.14.152"], "ansible_all_ipv6_addresses": ["fe80::8ff:dfff:fe7b:8e75"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.14.152", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:dfff:fe7b:8e75"]}, "ansible_fibre_channel_wwn": [], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 24971 1727096413.96269: stdout 
chunk (state=3): >>># clear sys.path_importer_cache <<< 24971 1727096413.96352: stdout chunk (state=3): >>># clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins <<< 24971 1727096413.96489: stdout chunk (state=3): >>># cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] 
removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd <<< 24971 1727096413.96528: stdout chunk (state=3): >>># destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing 
ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local <<< 24971 1727096413.96569: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing 
ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos <<< 24971 1727096413.96737: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy 
ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi <<< 24971 1727096413.96752: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 24971 1727096413.97247: stdout chunk (state=3): 
>>># destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro <<< 24971 1727096413.97265: stdout chunk (state=3): >>># destroy argparse # destroy logging <<< 24971 1727096413.97333: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal <<< 24971 1727096413.97338: stdout chunk (state=3): >>># destroy pickle # destroy _compat_pickle <<< 24971 1727096413.97399: stdout chunk (state=3): >>># destroy _pickle # destroy queue # destroy _heapq # destroy _queue <<< 24971 1727096413.97403: stdout chunk (state=3): >>># destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess <<< 24971 1727096413.97451: stdout chunk (state=3): >>># destroy base64 # destroy _ssl <<< 24971 1727096413.97507: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection <<< 24971 1727096413.97661: stdout chunk (state=3): >>># destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform <<< 24971 1727096413.97771: stdout chunk (state=3): >>># cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # 
cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools <<< 24971 1727096413.97774: stdout chunk (state=3): >>># cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types <<< 24971 1727096413.98089: stdout chunk (state=3): >>># cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 24971 1727096413.98102: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 24971 1727096413.98223: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig <<< 24971 1727096413.98230: stdout chunk (state=3): >>># destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref <<< 24971 1727096413.98252: stdout chunk (state=3): >>># destroy _hashlib <<< 24971 1727096413.98271: stdout chunk (state=3): >>># destroy _operator # destroy _sre # destroy _string # destroy re <<< 24971 1727096413.98286: stdout chunk (state=3): >>># destroy itertools <<< 24971 1727096413.98321: stdout chunk (state=3): >>># destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread <<< 24971 1727096413.98549: stdout chunk (state=3): >>># clear sys.audit hooks <<< 24971 1727096413.98773: 
stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. <<< 24971 1727096413.98776: stdout chunk (state=3): >>><<< 24971 1727096413.98778: stderr chunk (state=3): >>><<< 24971 1727096413.99192: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14f104d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14edfb30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14f12a50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14ce5130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14ce5fa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14d23dd0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14d23fe0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14d5b7a0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14d5be30> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14d3baa0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14d391c0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14d20f80> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14d7b710> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14d7a330> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14d3a090> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14d78b90> # 
/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14db0740> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14d20200> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c14db0bf0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14db0aa0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c14db0e90> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14d1ed20> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14db1580> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14db1250> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14db2480> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14dc8680> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c14dc9d60> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f9c14dcac00> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c14dcb260> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14dca150> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c14dcbce0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14dcb410> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14db24b0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c14abfbc0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c14ae86e0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14ae8440> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c14ae8710> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c14ae9040> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c14ae9a30> import 'hashlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f9c14ae88f0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14abdd60> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14aeade0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14ae9b50> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14db2ba0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14b17140> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14b37500> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14b982c0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14b9aa20> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14b983e0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14b612e0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c149a53d0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14b36300> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14aebd10> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # 
<_frozen_importlib_external.SourcelessFileLoader object at 0x7f9c14b36900> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_48675wkb/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14a0b0b0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c149e9fa0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c149e9130> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14a08f80> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c14a3a960> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14a3a6f0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14a3a000> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14a3aa50> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14a0bd40> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c14a3b6b0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 
'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c14a3b8f0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14a3be30> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c1432dbe0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c1432f800> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c143301d0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14331370> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14333e30> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c14a3bef0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c143320f0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c1433bce0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c1433a7b0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f9c1433a510> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c1433aa80> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14332600> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c1437ffe0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14380110> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c14381bb0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14381970> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c14384110> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c143822a0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14387860> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14384230> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c143886b0> # extension module 'systemd._reader' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c14388890> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c14388a10> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14380320> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c142140e0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c142154c0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c1438a870> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c1438bc20> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c1438a510> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c14219610> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c1421a480> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14ae99d0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c1421a3f0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c1421b470> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c14225f10> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14223c80> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches 
/usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c1430e900> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c143fe5d0> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c14226030> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c142156d0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c142ba030> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c13ee3f20> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c13ee8290> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c142a2e70> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c142baba0> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c142b86e0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c142b8b00> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c13eeb1d0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c13eeaa80> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c13eeac60> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c13ee9eb0> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c13eeb260> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c13f51d60> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c13eebd70> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c142b83b0> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # 
zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c13f534d0> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c13f52690> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c13f89f70> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c13f79cd0> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from 
'/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c13f9dac0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c13f9d7f0> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c13da2570> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c13da1040> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c13ee97f0> # /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c13deaf90> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c13de8f50> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from 
'/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c13deb290> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c13de9f40> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC7DqKfC0Ye8uNTSswfdzhsJfWNkZqr/RUstyC1z93+5saW22gxwNSvixV/q/ri8GxqpAsSE+oK2wsR0V0GdogZH271MQEZ63vu1adXuVvwMLN81oGLTzCz50BgmDXlvpEVTodRA7qlbviA7mmwUA/1WP1v1UqCiHmBjaiQT14v9PJhSSiWk1S/2gPlOmfl801arqI6YnvGYHiIop+XESUB4W8ewtJtTJhFqzucGnDCUYA1VAqcSIdjoQaZhZeDXc1gRTzx7QIe3EGJnCnomIPoXNvQk2UgnH08TJGZFoICgqdfZUi+g2uK+PweRJ1TUsmlf86iGOSFYgnCserDeAdpOIuLR5oBE8UNKjxPBW6hJ3It+vPT8mrYLOXrlbt7it9HhxfgYWcqc6ebJyBYNMo0bPdddEpIPiWDXZOoQmZGaGOpSa1K+8hxIvZ9t+jl8b8b8sODYeoVjVaeVXt0i5hoW0CoLWO77c+cGgwJ5+kzXD7eo/SVn+kYjDfKFBCtkJU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJ80c6wNB873zUjbHI8VtU557DuKd4APS4WjMTqAOLKKumkxtoQY9gCkk5ZG2HLqdKHBgsFER7nThIQ+11R1mBw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINeHLdc2PHGxhVM3VxIFOOPlFPTEnJXEcPAkPavmyu6v", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-14-152.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-14-152", "ansible_nodename": "ip-10-31-14-152.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec24182c1ede649ec25b32fe78ed72bb", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_pkg_mgr": "dnf", "ansible_iscsi_iqn": "", "ansible_apparmor": {"status": "disabled"}, "ansible_cmdline": {"BOOT_IMAGE": 
"(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": "39", "day": "23", "hour": "09", "minute": "00", "second": "13", "epoch": "1727096413", "epoch_int": "1727096413", "date": "2024-09-23", "time": "09:00:13", "iso8601_micro": "2024-09-23T13:00:13.608272Z", "iso8601": "2024-09-23T13:00:13Z", "iso8601_basic": "20240923T090013608272", "iso8601_basic_short": "20240923T090013", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_lsb": {}, "ansible_local": {}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2987, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 544, "free": 2987}, "nocache": {"free": 3304, "used": 227}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec24182c-1ede-649e-c25b-32fe78ed72bb", "ansible_product_uuid": "ec24182c-1ede-649e-c25b-32fe78ed72bb", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 556, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, 
"size_available": 261803384832, "block_size": 4096, "block_total": 65519099, "block_available": 63916842, "block_used": 1602257, "inode_total": 131070960, "inode_available": 131029181, "inode_used": 41779, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_loadavg": {"1m": 0.3837890625, "5m": 0.49853515625, "15m": 0.31005859375}, "ansible_is_chroot": false, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_fips": false, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.159 44844 10.31.14.152 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.159 44844 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", 
"esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:df:7b:8e:75", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.14.152", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:dfff:fe7b:8e75", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.14.152", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:df:7b:8e:75", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": 
["10.31.14.152"], "ansible_all_ipv6_addresses": ["fe80::8ff:dfff:fe7b:8e75"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.14.152", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:dfff:fe7b:8e75"]}, "ansible_fibre_channel_wwn": [], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] 
removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # 
cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob 
# cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # 
destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] 
removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping 
_functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. [WARNING]: Module invocation had junk after the JSON data: (interpreter shutdown/cleanup output identical to the dump above) [WARNING]: Platform linux on host managed_node3 is using the discovered Python interpreter at /usr/bin/python3.12, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible-core/2.17/reference_appendices/interpreter_discovery.html for more information.
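The interpreter-discovery warning above can be avoided by pinning the interpreter for the managed host instead of relying on discovery. A minimal inventory sketch, assuming a YAML inventory of the kind loaded at the start of this run; the host/address pairing and interpreter path are taken from the log, but the actual file used by the test is not shown here, so this is illustrative only:

    all:
      hosts:
        managed_node3:
          ansible_host: 10.31.14.152
          # Pinning the interpreter explicitly skips discovery, so the
          # "discovered Python interpreter" warning is not emitted.
          ansible_python_interpreter: /usr/bin/python3.12

With ansible_python_interpreter set, ansible-playbook uses that path directly for module execution on the host rather than probing for one.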
24971 1727096414.02183: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096412.6109333-24991-220611245613430/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24971 1727096414.02215: _low_level_execute_command(): starting 24971 1727096414.02225: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096412.6109333-24991-220611245613430/ > /dev/null 2>&1 && sleep 0' 24971 1727096414.03448: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24971 1727096414.03584: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096414.03772: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096414.03776: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096414.03791: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096414.03854: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096414.05776: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096414.05780: stdout chunk (state=3): >>><<< 24971 1727096414.05783: stderr chunk (state=3): >>><<< 24971 1727096414.05825: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 
10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096414.05841: handler run complete 24971 1727096414.06275: variable 'ansible_facts' from source: unknown 24971 1727096414.06777: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096414.07338: variable 'ansible_facts' from source: unknown 24971 1727096414.07447: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096414.07589: attempt loop complete, returning result 24971 1727096414.07599: _execute() done 24971 1727096414.07606: dumping result to json 24971 1727096414.07639: done dumping result, returning 24971 1727096414.07660: done running TaskExecutor() for managed_node3/TASK: Gathering Facts [0afff68d-5257-3482-6844-0000000000b9] 24971 1727096414.07673: sending task result for task 0afff68d-5257-3482-6844-0000000000b9 24971 1727096414.08553: done sending task result for task 0afff68d-5257-3482-6844-0000000000b9 ok: [managed_node3] 24971 1727096414.08564: WORKER PROCESS EXITING 24971 1727096414.08657: no more pending results, returning what we have 24971 1727096414.08662: results queue empty 24971 1727096414.08663: checking for any_errors_fatal 24971 1727096414.08664: done checking for any_errors_fatal 24971 1727096414.08664: checking for max_fail_percentage 24971 1727096414.08665: done checking for max_fail_percentage 24971 1727096414.08666: checking to see if all hosts have failed and the running result is not ok 24971 1727096414.08666: done checking to see if all hosts have failed 24971 1727096414.08668: getting the remaining hosts for this loop 24971 1727096414.08670: done getting the remaining hosts for this loop 24971 1727096414.08672: getting the next task for host managed_node3 24971 1727096414.08676: done getting next task for host managed_node3 24971 1727096414.08677: ^ task is: TASK: meta (flush_handlers) 24971 1727096414.08679: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24971 1727096414.08685: getting variables 24971 1727096414.08686: in VariableManager get_vars() 24971 1727096414.08707: Calling all_inventory to load vars for managed_node3 24971 1727096414.08710: Calling groups_inventory to load vars for managed_node3 24971 1727096414.08712: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096414.08719: Calling all_plugins_play to load vars for managed_node3 24971 1727096414.08720: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096414.08722: Calling groups_plugins_play to load vars for managed_node3 24971 1727096414.08915: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096414.09170: done with get_vars() 24971 1727096414.09380: done getting variables 24971 1727096414.09444: in VariableManager get_vars() 24971 1727096414.09454: Calling all_inventory to load vars for managed_node3 24971 1727096414.09456: Calling groups_inventory to load vars for managed_node3 24971 1727096414.09458: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096414.09462: Calling all_plugins_play to load vars for managed_node3 24971 1727096414.09465: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096414.09478: Calling groups_plugins_play to load vars for managed_node3 24971 1727096414.09748: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096414.10043: done with get_vars() 24971 1727096414.10056: done queuing things up, now waiting for results queue to drain 24971 1727096414.10057: results queue empty 24971 1727096414.10058: checking for any_errors_fatal 24971 1727096414.10060: done checking for any_errors_fatal 24971 1727096414.10061: checking for max_fail_percentage 24971 1727096414.10062: done checking for max_fail_percentage 24971 1727096414.10063: checking to see if all hosts have failed and the running result is not ok 24971 1727096414.10102: done checking to see if all hosts have failed 24971 1727096414.10103: getting the remaining hosts for this loop 24971 1727096414.10105: done getting the remaining hosts for this loop 24971 1727096414.10119: getting the next task for host managed_node3 24971 1727096414.10126: done getting next task for host managed_node3 24971 1727096414.10129: ^ task is: TASK: Include the task 'el_repo_setup.yml' 24971 1727096414.10130: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24971 1727096414.10133: getting variables 24971 1727096414.10134: in VariableManager get_vars() 24971 1727096414.10169: Calling all_inventory to load vars for managed_node3 24971 1727096414.10172: Calling groups_inventory to load vars for managed_node3 24971 1727096414.10177: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096414.10183: Calling all_plugins_play to load vars for managed_node3 24971 1727096414.10186: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096414.10188: Calling groups_plugins_play to load vars for managed_node3 24971 1727096414.10290: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096414.10432: done with get_vars() 24971 1727096414.10442: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tests_ipv6_nm.yml:11 Monday 23 September 2024 09:00:14 -0400 (0:00:01.569) 0:00:01.582 ****** 24971 1727096414.10517: entering _queue_task() for managed_node3/include_tasks 24971 1727096414.10518: Creating lock for include_tasks 24971 1727096414.10822: worker is 1 (out of 1 available) 24971 1727096414.10833: exiting _queue_task() for managed_node3/include_tasks 24971 1727096414.10844: done queuing things up, now waiting for results queue to drain 24971 1727096414.10846: waiting for pending results... 24971 1727096414.11090: running TaskExecutor() for managed_node3/TASK: Include the task 'el_repo_setup.yml' 24971 1727096414.11179: in run() - task 0afff68d-5257-3482-6844-000000000006 24971 1727096414.11194: variable 'ansible_search_path' from source: unknown 24971 1727096414.11245: calling self._execute() 24971 1727096414.11288: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096414.11293: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096414.11375: variable 'omit' from source: magic vars 24971 1727096414.11457: _execute() done 24971 1727096414.11463: dumping result to json 24971 1727096414.11474: done dumping result, returning 24971 1727096414.11484: done running TaskExecutor() for managed_node3/TASK: Include the task 'el_repo_setup.yml' [0afff68d-5257-3482-6844-000000000006] 24971 1727096414.11494: sending task result for task 0afff68d-5257-3482-6844-000000000006 24971 1727096414.11795: done sending task result for task 0afff68d-5257-3482-6844-000000000006 24971 1727096414.11799: WORKER PROCESS EXITING 24971 1727096414.11834: no more pending results, returning what we have 24971 1727096414.11838: in VariableManager get_vars() 24971 1727096414.11876: Calling all_inventory to load vars for managed_node3 24971 1727096414.11879: Calling groups_inventory to load vars for managed_node3 24971 1727096414.11882: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096414.11891: Calling all_plugins_play to load vars for managed_node3 24971 1727096414.11893: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096414.11896: Calling groups_plugins_play to load vars for managed_node3 24971 1727096414.12122: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096414.12402: done with get_vars() 24971 1727096414.12410: variable 'ansible_search_path' from source: unknown 24971 1727096414.12422: we have included files to process 24971 1727096414.12424: generating 
all_blocks data 24971 1727096414.12425: done generating all_blocks data 24971 1727096414.12426: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 24971 1727096414.12427: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 24971 1727096414.12429: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 24971 1727096414.13247: in VariableManager get_vars() 24971 1727096414.13262: done with get_vars() 24971 1727096414.13276: done processing included file 24971 1727096414.13278: iterating over new_blocks loaded from include file 24971 1727096414.13295: in VariableManager get_vars() 24971 1727096414.13306: done with get_vars() 24971 1727096414.13307: filtering new block on tags 24971 1727096414.13321: done filtering new block on tags 24971 1727096414.13324: in VariableManager get_vars() 24971 1727096414.13341: done with get_vars() 24971 1727096414.13343: filtering new block on tags 24971 1727096414.13358: done filtering new block on tags 24971 1727096414.13361: in VariableManager get_vars() 24971 1727096414.13373: done with get_vars() 24971 1727096414.13374: filtering new block on tags 24971 1727096414.13387: done filtering new block on tags 24971 1727096414.13389: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed_node3 24971 1727096414.13394: extending task lists for all hosts with included blocks 24971 1727096414.13438: done extending task lists 24971 1727096414.13439: done processing included files 24971 1727096414.13440: results queue empty 24971 1727096414.13441: checking for any_errors_fatal 24971 1727096414.13450: done checking for any_errors_fatal 24971 1727096414.13451: checking for max_fail_percentage 24971 1727096414.13452: done checking for max_fail_percentage 24971 1727096414.13453: checking to see if all hosts have failed and the running result is not ok 24971 1727096414.13453: done checking to see if all hosts have failed 24971 1727096414.13454: getting the remaining hosts for this loop 24971 1727096414.13456: done getting the remaining hosts for this loop 24971 1727096414.13458: getting the next task for host managed_node3 24971 1727096414.13462: done getting next task for host managed_node3 24971 1727096414.13464: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 24971 1727096414.13466: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24971 1727096414.13470: getting variables 24971 1727096414.13471: in VariableManager get_vars() 24971 1727096414.13480: Calling all_inventory to load vars for managed_node3 24971 1727096414.13482: Calling groups_inventory to load vars for managed_node3 24971 1727096414.13484: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096414.13489: Calling all_plugins_play to load vars for managed_node3 24971 1727096414.13491: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096414.13494: Calling groups_plugins_play to load vars for managed_node3 24971 1727096414.13631: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096414.13828: done with get_vars() 24971 1727096414.13837: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Monday 23 September 2024 09:00:14 -0400 (0:00:00.033) 0:00:01.616 ****** 24971 1727096414.13912: entering _queue_task() for managed_node3/setup 24971 1727096414.14323: worker is 1 (out of 1 available) 24971 1727096414.14332: exiting _queue_task() for managed_node3/setup 24971 1727096414.14343: done queuing things up, now waiting for results queue to drain 24971 1727096414.14344: waiting for pending results... 24971 1727096414.14447: running TaskExecutor() for managed_node3/TASK: Gather the minimum subset of ansible_facts required by the network role test 24971 1727096414.14565: in run() - task 0afff68d-5257-3482-6844-0000000000ca 24971 1727096414.14631: variable 'ansible_search_path' from source: unknown 24971 1727096414.14638: variable 'ansible_search_path' from source: unknown 24971 1727096414.14716: calling self._execute() 24971 1727096414.14795: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096414.14806: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096414.14820: variable 'omit' from source: magic vars 24971 1727096414.15983: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24971 1727096414.18392: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24971 1727096414.18469: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24971 1727096414.18521: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24971 1727096414.18559: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24971 1727096414.18592: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24971 1727096414.18687: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24971 1727096414.18729: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24971 1727096414.18757: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 24971 1727096414.18807: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24971 1727096414.18832: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24971 1727096414.19038: variable 'ansible_facts' from source: unknown 24971 1727096414.19092: variable 'network_test_required_facts' from source: task vars 24971 1727096414.19144: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 24971 1727096414.19216: variable 'omit' from source: magic vars 24971 1727096414.19220: variable 'omit' from source: magic vars 24971 1727096414.19235: variable 'omit' from source: magic vars 24971 1727096414.19271: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24971 1727096414.19303: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24971 1727096414.19330: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24971 1727096414.19352: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096414.19377: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096414.19414: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24971 1727096414.19423: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096414.19436: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096414.19542: Set connection var ansible_shell_type to sh 24971 1727096414.19583: Set connection var ansible_shell_executable to /bin/sh 24971 1727096414.19586: Set connection var ansible_timeout to 10 24971 1727096414.19589: Set connection var ansible_connection to ssh 24971 1727096414.19592: Set connection var ansible_pipelining to False 24971 1727096414.19600: Set connection var ansible_module_compression to ZIP_DEFLATED 24971 1727096414.19623: variable 'ansible_shell_executable' from source: unknown 24971 1727096414.19650: variable 'ansible_connection' from source: unknown 24971 1727096414.19652: variable 'ansible_module_compression' from source: unknown 24971 1727096414.19655: variable 'ansible_shell_type' from source: unknown 24971 1727096414.19656: variable 'ansible_shell_executable' from source: unknown 24971 1727096414.19659: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096414.19661: variable 'ansible_pipelining' from source: unknown 24971 1727096414.19662: variable 'ansible_timeout' from source: unknown 24971 1727096414.19691: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096414.19820: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 24971 1727096414.19866: variable 'omit' from source: magic vars 24971 1727096414.19871: starting attempt loop 24971 
1727096414.19873: running the handler 24971 1727096414.19876: _low_level_execute_command(): starting 24971 1727096414.19878: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24971 1727096414.20642: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096414.20709: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096414.20758: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096414.20841: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096414.22515: stdout chunk (state=3): >>>/root <<< 24971 1727096414.22705: stdout chunk (state=3): >>><<< 24971 1727096414.22709: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096414.22712: stderr chunk (state=3): >>><<< 24971 1727096414.22714: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096414.22732: _low_level_execute_command(): starting 24971 1727096414.22744: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096414.22689-25051-247577177057774 `" && echo ansible-tmp-1727096414.22689-25051-247577177057774="` echo /root/.ansible/tmp/ansible-tmp-1727096414.22689-25051-247577177057774 `" ) && sleep 0' 24971 1727096414.23914: stderr chunk 
(state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24971 1727096414.23923: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096414.23940: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096414.24134: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096414.24191: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096414.24320: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096414.26107: stdout chunk (state=3): >>>ansible-tmp-1727096414.22689-25051-247577177057774=/root/.ansible/tmp/ansible-tmp-1727096414.22689-25051-247577177057774 <<< 24971 1727096414.26239: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096414.26243: stdout chunk (state=3): >>><<< 24971 1727096414.26251: stderr chunk (state=3): >>><<< 24971 1727096414.26271: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096414.22689-25051-247577177057774=/root/.ansible/tmp/ansible-tmp-1727096414.22689-25051-247577177057774 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096414.26370: variable 'ansible_module_compression' from source: unknown 24971 1727096414.26421: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24971ctsu8jan/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 24971 1727096414.26872: variable 'ansible_facts' from source: unknown 24971 1727096414.27100: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1727096414.22689-25051-247577177057774/AnsiballZ_setup.py 24971 1727096414.27349: Sending initial data 24971 1727096414.27353: Sent initial data (152 bytes) 24971 1727096414.28647: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096414.28806: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096414.28815: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096414.28878: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096414.30502: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24971 1727096414.30588: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096414.22689-25051-247577177057774/AnsiballZ_setup.py" <<< 24971 1727096414.30592: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24971ctsu8jan/tmpievmilgf /root/.ansible/tmp/ansible-tmp-1727096414.22689-25051-247577177057774/AnsiballZ_setup.py <<< 24971 1727096414.30615: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24971ctsu8jan/tmpievmilgf" to remote "/root/.ansible/tmp/ansible-tmp-1727096414.22689-25051-247577177057774/AnsiballZ_setup.py" <<< 24971 1727096414.30619: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096414.22689-25051-247577177057774/AnsiballZ_setup.py" <<< 24971 1727096414.33003: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096414.33008: stdout chunk (state=3): >>><<< 24971 1727096414.33015: stderr chunk (state=3): >>><<< 24971 1727096414.33036: done transferring module to remote 24971 1727096414.33049: _low_level_execute_command(): starting 24971 1727096414.33053: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096414.22689-25051-247577177057774/ /root/.ansible/tmp/ansible-tmp-1727096414.22689-25051-247577177057774/AnsiballZ_setup.py && sleep 0' 24971 1727096414.34320: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24971 1727096414.34350: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096414.34362: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096414.34397: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24971 1727096414.34427: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 24971 1727096414.34543: stderr chunk (state=3): >>>debug2: match not found <<< 24971 1727096414.34546: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096414.34549: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24971 1727096414.34551: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address <<< 24971 1727096414.34553: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24971 1727096414.34556: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096414.34824: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096414.34827: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096414.34991: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096414.35009: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096414.35078: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096414.37650: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096414.37654: stdout chunk (state=3): 
>>><<< 24971 1727096414.37656: stderr chunk (state=3): >>><<< 24971 1727096414.37659: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096414.37661: _low_level_execute_command(): starting 24971 1727096414.37663: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096414.22689-25051-247577177057774/AnsiballZ_setup.py && sleep 0' 24971 1727096414.38715: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096414.38719: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 24971 1727096414.38722: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096414.38724: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096414.38726: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096414.38772: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096414.38987: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096414.39050: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096414.41674: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin <<< 24971 1727096414.41695: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 24971 1727096414.41993: stdout chunk (state=3): >>>import '_io' # <<< 24971 1727096414.42011: stdout chunk (state=3): >>>import 'marshal' # import 'posix' # import 
'_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4141a184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41419e7b30> <<< 24971 1727096414.42034: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<< 24971 1727096414.42052: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4141a1aa50> <<< 24971 1727096414.42098: stdout chunk (state=3): >>>import '_signal' # import '_abc' # import 'abc' # <<< 24971 1727096414.42135: stdout chunk (state=3): >>>import 'io' # import '_stat' # import 'stat' # <<< 24971 1727096414.42302: stdout chunk (state=3): >>>import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages <<< 24971 1727096414.42328: stdout chunk (state=3): >>>Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 24971 1727096414.42448: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f414182d130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 24971 1727096414.42451: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f414182dfa0> <<< 24971 1727096414.42750: stdout chunk (state=3): >>>import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 24971 1727096414.42881: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 24971 1727096414.42992: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 24971 1727096414.42996: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 24971 1727096414.43031: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f414186be90> <<< 24971 1727096414.43052: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 24971 1727096414.43305: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f414186bf50> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 24971 1727096414.43398: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 24971 1727096414.43401: stdout chunk (state=3): >>>import 'itertools' # <<< 24971 1727096414.43404: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41418a3830> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<< 24971 1727096414.43406: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41418a3ec0> <<< 24971 1727096414.43408: stdout chunk (state=3): >>>import '_collections' # <<< 24971 1727096414.43410: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4141883b60> <<< 24971 1727096414.43412: stdout chunk (state=3): >>>import '_functools' # <<< 24971 1727096414.43456: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4141881280> <<< 24971 1727096414.43459: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4141869040> <<< 24971 1727096414.43524: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 24971 1727096414.43527: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 24971 1727096414.43589: stdout chunk 
(state=3): >>>import '_sre' # <<< 24971 1727096414.43676: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 24971 1727096414.43679: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41418c37d0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41418c23f0> <<< 24971 1727096414.43748: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4141882150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41418c0c20> <<< 24971 1727096414.43848: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41418f8860> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41418682c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 24971 1727096414.43890: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f41418f8d10> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41418f8bc0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f41418f8f80> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4141866de0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 24971 1727096414.44135: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41418f9610> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41418f92e0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f41418fa510> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py <<< 24971 1727096414.44156: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4141910710> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4141911df0> <<< 24971 1727096414.44198: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py <<< 24971 1727096414.44209: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 24971 1727096414.44260: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4141912c90> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' <<< 24971 1727096414.44291: stdout chunk (state=3): >>># extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f41419132f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41419121e0> <<< 24971 1727096414.44306: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py <<< 24971 1727096414.44337: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 24971 1727096414.44361: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4141913d70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41419134a0> <<< 24971 1727096414.44409: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41418fa540> <<< 24971 1727096414.44473: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 24971 1727096414.44501: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 24971 1727096414.44522: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 24971 1727096414.44574: stdout chunk (state=3): >>># extension module 'math' loaded from 
'/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f414162fbf0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 24971 1727096414.44598: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f41416586e0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4141658440> <<< 24971 1727096414.44666: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4141658710> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 24971 1727096414.44756: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 24971 1727096414.44933: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4141659040> <<< 24971 1727096414.45107: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f41416599a0> <<< 24971 1727096414.45124: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41416588f0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f414162dd90> <<< 24971 1727096414.45152: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 24971 1727096414.45197: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 24971 1727096414.45209: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 24971 1727096414.45231: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f414165adb0> <<< 24971 1727096414.45261: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4141659af0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41418fac30> <<< 24971 1727096414.45287: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 24971 1727096414.45390: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 24971 1727096414.45430: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 24971 1727096414.45470: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4141687110> <<< 24971 1727096414.45560: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 24971 1727096414.45576: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 24971 1727096414.45603: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 24971 1727096414.45729: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41416a74a0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 24971 1727096414.45800: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 24971 1727096414.45894: stdout chunk (state=3): >>>import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4141708260> <<< 24971 1727096414.45924: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 24971 1727096414.46203: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f414170a9c0> <<< 24971 1727096414.46246: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4141708380> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41416d1280> <<< 24971 1727096414.46276: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4141519340> <<< 24971 1727096414.46455: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41416a62a0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f414165bce0> <<< 24971 1727096414.46575: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 24971 1727096414.46598: 
stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f41415195b0> <<< 24971 1727096414.46978: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_bs65rtvw/ansible_setup_payload.zip' # zipimport: zlib available <<< 24971 1727096414.47090: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.47168: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 24971 1727096414.47248: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 24971 1727096414.47391: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41415830b0> import '_typing' # <<< 24971 1727096414.47506: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4141561fa0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4141561160> # zipimport: zlib available <<< 24971 1727096414.47529: stdout chunk (state=3): >>>import 'ansible' # <<< 24971 1727096414.47708: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 24971 1727096414.47712: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.47983: stdout chunk (state=3): >>>import 'ansible.module_utils' # # zipimport: zlib available <<< 24971 1727096414.49088: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.50360: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py <<< 24971 1727096414.50400: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4141580f80> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py <<< 24971 1727096414.50404: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 24971 1727096414.50530: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f41415b29c0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41415b2750> <<< 24971 1727096414.50565: stdout chunk (state=3): >>>import 'json.decoder' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f41415b2060> <<< 24971 1727096414.50581: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py <<< 24971 1727096414.50644: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41415b2ab0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4141583ad0> <<< 24971 1727096414.50675: stdout chunk (state=3): >>>import 'atexit' # <<< 24971 1727096414.50686: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f41415b3740> <<< 24971 1727096414.50728: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f41415b3980> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 24971 1727096414.50786: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 24971 1727096414.50873: stdout chunk (state=3): >>>import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41415b3ec0> import 'pwd' # <<< 24971 1727096414.50896: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 24971 1727096414.50922: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 24971 1727096414.51074: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4140f2db50> <<< 24971 1727096414.51124: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4140f2f7d0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4140f301a0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 24971 1727096414.51160: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4140f31340> <<< 24971 1727096414.51402: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from 
'/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4140f33d70> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4140f30110> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4140f32060> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 24971 1727096414.51434: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 24971 1727096414.51691: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 24971 1727096414.51695: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py <<< 24971 1727096414.51723: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4140f3baa0><<< 24971 1727096414.51745: stdout chunk (state=3): >>> import '_tokenize' # <<< 24971 1727096414.51859: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4140f3a570> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4140f3a2d0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py <<< 24971 1727096414.51935: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 24971 1727096414.52031: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4140f3a840> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4140f32540> <<< 24971 1727096414.52035: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4140f7fd70> <<< 24971 1727096414.52089: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4140f7fda0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 24971 1727096414.52105: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 24971 1727096414.52146: stdout 
chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 24971 1727096414.52175: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' <<< 24971 1727096414.52184: stdout chunk (state=3): >>>import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4140f81910> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4140f816d0> <<< 24971 1727096414.52233: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 24971 1727096414.52391: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4140f83e60> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4140f82000> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 24971 1727096414.52409: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 24971 1727096414.52481: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4140f87620> <<< 24971 1727096414.52676: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4140f83fb0> <<< 24971 1727096414.52740: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4140f883e0> <<< 24971 1727096414.52836: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4140f885c0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' <<< 24971 1727096414.52859: stdout chunk (state=3): >>># extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4140f88920> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4140f802c0> 
<<< 24971 1727096414.52888: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 24971 1727096414.52926: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 24971 1727096414.52948: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 24971 1727096414.53084: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4140f8bf80> <<< 24971 1727096414.53219: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 24971 1727096414.53222: stdout chunk (state=3): >>>import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4140e14f50> <<< 24971 1727096414.53259: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4140f8a750> <<< 24971 1727096414.53302: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4140f8baa0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4140f8a390> # zipimport: zlib available <<< 24971 1727096414.53331: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 24971 1727096414.53449: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.53707: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 24971 1727096414.53718: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.54047: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.54381: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.55056: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 24971 1727096414.55085: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 24971 1727096414.55201: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module 
'_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4140e19160> <<< 24971 1727096414.55253: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py <<< 24971 1727096414.55261: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 24971 1727096414.55284: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4140e19e80> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4140f8b5f0> <<< 24971 1727096414.55352: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available <<< 24971 1727096414.55384: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils._text' # <<< 24971 1727096414.55402: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.55640: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.55855: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 24971 1727096414.55875: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4140e19ee0> # zipimport: zlib available <<< 24971 1727096414.56606: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.57334: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.57442: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.57518: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 24971 1727096414.57584: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 24971 1727096414.57623: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 24971 1727096414.57626: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.57728: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.57851: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 24971 1727096414.57880: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # <<< 24971 1727096414.57942: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.57947: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.57992: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 24971 1727096414.58011: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.58366: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.58803: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 24971 1727096414.58944: stdout chunk (state=3): >>>import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4140e1b0b0> # zipimport: zlib available <<< 24971 1727096414.59015: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.59122: stdout chunk (state=3): >>>import 
'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # <<< 24971 1727096414.59131: stdout chunk (state=3): >>>import 'ansible.module_utils.common.parameters' # <<< 24971 1727096414.59137: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # <<< 24971 1727096414.59159: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.59208: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.59257: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 24971 1727096414.59320: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 24971 1727096414.59381: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.59461: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.59561: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 24971 1727096414.59608: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 24971 1727096414.59792: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4140e25d30> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4140e20a40> <<< 24971 1727096414.59810: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # <<< 24971 1727096414.59828: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.59904: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.59994: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.60072: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.60082: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 24971 1727096414.60124: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 24971 1727096414.60144: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 24971 1727096414.60224: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 24971 1727096414.60266: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 24971 1727096414.60343: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4140f0e750> <<< 24971 1727096414.60452: stdout chunk (state=3): >>>import 'argparse' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f41415de420> <<< 24971 1727096414.60515: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4140e25f10> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4140e25bb0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 24971 1727096414.60520: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.60563: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.60646: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 24971 1727096414.60662: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 24971 1727096414.60683: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.60692: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.modules' # <<< 24971 1727096414.60706: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.60786: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.60890: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 24971 1727096414.60911: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.61018: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 24971 1727096414.61060: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.61110: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # <<< 24971 1727096414.61119: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.61231: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.61327: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.61402: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat.typing' # <<< 24971 1727096414.61408: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.61679: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.61981: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 24971 1727096414.62054: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 24971 1727096414.62080: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py <<< 24971 1727096414.62091: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' <<< 24971 1727096414.62114: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 24971 1727096414.62188: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4140eb5af0> <<< 24971 1727096414.62203: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches 
/usr/lib64/python3.12/multiprocessing/reduction.py <<< 24971 1727096414.62224: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 24971 1727096414.62290: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 24971 1727096414.62322: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' <<< 24971 1727096414.62366: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4140a63e00> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 24971 1727096414.62378: stdout chunk (state=3): >>># extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4140a683e0> <<< 24971 1727096414.62442: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4140e9c8c0> <<< 24971 1727096414.62517: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4140eb6690> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4140eb41d0> <<< 24971 1727096414.62535: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4140eb7c80> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 24971 1727096414.62620: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py <<< 24971 1727096414.62651: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 24971 1727096414.62661: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 24971 1727096414.62715: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4140a6b0b0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4140a6a960> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4140a6ab10> <<< 24971 1727096414.62762: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4140a69d90> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches 
/usr/lib64/python3.12/multiprocessing/util.py <<< 24971 1727096414.62912: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' <<< 24971 1727096414.62942: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4140a6b1a0> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 24971 1727096414.62991: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 24971 1727096414.63051: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4140ac9ca0> <<< 24971 1727096414.63055: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4140a6bc80> <<< 24971 1727096414.63085: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4140eb7e60> import 'ansible.module_utils.facts.timeout' # <<< 24971 1727096414.63106: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.collector' # # zipimport: zlib available <<< 24971 1727096414.63126: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.other' # <<< 24971 1727096414.63230: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 24971 1727096414.63301: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 24971 1727096414.63315: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.63384: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.63469: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 24971 1727096414.63487: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # <<< 24971 1727096414.63646: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available <<< 24971 1727096414.63692: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available <<< 24971 1727096414.63826: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available <<< 24971 1727096414.63892: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.63958: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.64028: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.64109: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # <<< 24971 1727096414.64140: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.64820: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.65530: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # # zipimport: 
zlib available <<< 24971 1727096414.65642: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.65675: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.65716: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.65774: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # <<< 24971 1727096414.65778: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.date_time' # <<< 24971 1727096414.65826: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 24971 1727096414.65855: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # <<< 24971 1727096414.65878: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.66024: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # <<< 24971 1727096414.66027: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.66060: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.66107: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available <<< 24971 1727096414.66139: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.66189: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # <<< 24971 1727096414.66192: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.66347: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.66450: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py <<< 24971 1727096414.66481: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 24971 1727096414.66510: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4140acbe30> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 24971 1727096414.66513: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 24971 1727096414.66693: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4140aca750> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available <<< 24971 1727096414.66834: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.66871: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 24971 1727096414.66892: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.67015: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.67131: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 24971 1727096414.67144: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.67231: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.67332: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # <<< 24971 1727096414.67347: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.67391: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.67453: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 24971 1727096414.67513: stdout 
chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 24971 1727096414.67601: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 24971 1727096414.67689: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4140b05fa0> <<< 24971 1727096414.67983: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4140acb560> <<< 24971 1727096414.67988: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.python' # <<< 24971 1727096414.68037: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.68069: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.68133: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # <<< 24971 1727096414.68151: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.68266: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.68388: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.68791: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # <<< 24971 1727096414.68800: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.68837: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.68896: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available <<< 24971 1727096414.68949: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.69014: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 24971 1727096414.69048: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 24971 1727096414.69100: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4140b19d60> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4140b1b740> import 'ansible.module_utils.facts.system.user' # <<< 24971 1727096414.69120: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # <<< 24971 1727096414.69169: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 24971 1727096414.69231: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available <<< 24971 1727096414.69466: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.69686: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 24971 1727096414.69769: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.69934: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.69975: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.70032: 
stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.70094: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # <<< 24971 1727096414.70098: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.darwin' # <<< 24971 1727096414.70116: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.70147: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 24971 1727096414.70346: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.70553: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # <<< 24971 1727096414.70647: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.70737: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.71010: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available <<< 24971 1727096414.71014: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.71897: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.72675: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # <<< 24971 1727096414.72682: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hurd' # <<< 24971 1727096414.72836: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 24971 1727096414.72989: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 24971 1727096414.72992: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.73134: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.73273: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 24971 1727096414.73291: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.73508: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.73742: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 24971 1727096414.73760: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.73769: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.73781: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network' # <<< 24971 1727096414.73787: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.73836: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.73895: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # <<< 24971 1727096414.73902: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.74045: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.74182: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.74741: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.74801: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # <<< 24971 1727096414.74809: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.aix' # <<< 24971 1727096414.74819: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.74864: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.74906: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # <<< 
24971 1727096414.74918: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.74946: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.74974: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # <<< 24971 1727096414.74984: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.75083: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.75172: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # <<< 24971 1727096414.75189: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.75208: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.75238: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # <<< 24971 1727096414.75250: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.75330: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.75398: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # <<< 24971 1727096414.75411: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.75483: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.75557: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 24971 1727096414.75737: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.75980: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.76378: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 24971 1727096414.76384: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.76459: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.76542: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # <<< 24971 1727096414.76551: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.76595: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.76630: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # <<< 24971 1727096414.76687: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 24971 1727096414.76720: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # <<< 24971 1727096414.76742: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.76773: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.76822: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available <<< 24971 1727096414.77033: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # <<< 24971 1727096414.77060: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.77075: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.77084: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available <<< 24971 1727096414.77139: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.77192: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # <<< 24971 1727096414.77207: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.77223: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.77253: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 24971 1727096414.77309: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.77381: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.77475: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.77593: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available <<< 24971 1727096414.77718: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # <<< 24971 1727096414.77730: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.78032: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.78325: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 24971 1727096414.78346: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.78392: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.78458: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # <<< 24971 1727096414.78521: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 24971 1727096414.78588: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # <<< 24971 1727096414.78593: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.78707: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.78819: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # <<< 24971 1727096414.78827: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.default_collectors' # <<< 24971 1727096414.78837: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.79087: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # <<< 24971 1727096414.79097: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 24971 1727096414.79186: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096414.79481: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 24971 1727096414.79512: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py <<< 24971 1727096414.79523: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 24971 1727096414.79561: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4140917440> <<< 24971 1727096414.79580: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4140916510> <<< 24971 1727096414.79649: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4140914da0> <<< 24971 1727096414.81118: stdout chunk 
(state=3): >>> <<< 24971 1727096414.81151: stdout chunk (state=3): >>>{"ansible_facts": {"ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-14-152.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-14-152", "ansible_nodename": "ip-10-31-14-152.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec24182c1ede649ec25b32fe78ed72bb", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC7DqKfC0Ye8uNTSswfdzhsJfWNkZqr/RUstyC1z93+5saW22gxwNSvixV/q/ri8GxqpAsSE+oK2wsR0V0GdogZH271MQEZ63vu1adXuVvwMLN81oGLTzCz50BgmDXlvpEVTodRA7qlbviA7mmwUA/1WP1v1UqCiHmBjaiQT14v9PJhSSiWk1S/2gPlOmfl801arqI6YnvGYHiIop+XESUB4W8ewtJtTJhFqzucGnDCUYA1VAqcSIdjoQaZhZeDXc1gRTzx7QIe3EGJnCnomIPoXNvQk2UgnH08TJGZ<<< 24971 1727096414.81377: stdout chunk (state=3): >>>FoICgqdfZUi+g2uK+PweRJ1TUsmlf86iGOSFYgnCserDeAdpOIuLR5oBE8UNKjxPBW6hJ3It+vPT8mrYLOXrlbt7it9HhxfgYWcqc6ebJyBYNMo0bPdddEpIPiWDXZOoQmZGaGOpSa1K+8hxIvZ9t+jl8b8b8sODYeoVjVaeVXt0i5hoW0CoLWO77c+cGgwJ5+kzXD7eo/SVn+kYjDfKFBCtkJU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJ80c6wNB873zUjbHI8VtU557DuKd4APS4WjMTqAOLKKumkxtoQY9gCkk5ZG2HLqdKHBgsFER7nThIQ+11R1mBw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINeHLdc2PHGxhVM3VxIFOOPlFPTEnJXEcPAkPavmyu6v", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_apparmor": {"status": "disabled"}, "ansible_fips": false, "ansible_local": {}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": "39", "day": "23", "hour": "09", "minute": "00", "second": "14", "epoch": "1727096414", "epoch_int": "1727096414", "date": "2024-09-23", "time": 
"09:00:14", "iso8601_micro": "2024-09-23T13:00:14.801774Z", "iso8601": "2024-09-23T13:00:14Z", "iso8601_basic": "20240923T090014801774", "iso8601_basic_short": "20240923T090014", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_lsb": {}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.159 44844 10.31.14.152 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.159 44844 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 24971 1727096414.82029: stdout chunk (state=3): >>># clear sys.path_importer_cache<<< 24971 1727096414.82035: stdout chunk (state=3): >>> <<< 24971 1727096414.82061: stdout chunk (state=3): >>># clear sys.path_hooks<<< 24971 1727096414.82077: stdout chunk (state=3): >>> <<< 24971 1727096414.82084: stdout chunk (state=3): >>># clear builtins._ <<< 24971 1727096414.82106: stdout chunk (state=3): >>># clear sys.path<<< 24971 1727096414.82113: stdout chunk (state=3): >>> # clear sys.argv <<< 24971 1727096414.82136: stdout chunk (state=3): >>># clear sys.ps1 # clear sys.ps2<<< 24971 1727096414.82147: stdout chunk (state=3): >>> <<< 24971 1727096414.82179: stdout chunk (state=3): >>># clear sys.last_exc<<< 24971 1727096414.82183: stdout chunk (state=3): >>> # clear sys.last_type<<< 24971 1727096414.82189: stdout chunk (state=3): >>> # clear sys.last_value<<< 24971 1727096414.82203: stdout chunk (state=3): >>> # clear sys.last_traceback<<< 24971 1727096414.82232: stdout chunk (state=3): >>> # clear sys.__interactivehook__<<< 24971 1727096414.82236: stdout chunk (state=3): >>> # clear sys.meta_path<<< 24971 1727096414.82248: stdout chunk (state=3): >>> # restore sys.stdin # restore sys.stdout # restore sys.stderr<<< 24971 1727096414.82271: stdout chunk (state=3): >>> # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external <<< 24971 1727096414.82293: stdout chunk (state=3): >>># cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] 
removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc<<< 24971 1727096414.82321: stdout chunk (state=3): >>> # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator <<< 24971 1727096414.82342: stdout chunk (state=3): >>># cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools<<< 24971 1727096414.82369: stdout chunk (state=3): >>> # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external<<< 24971 1727096414.82390: stdout chunk (state=3): >>> # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression<<< 24971 1727096414.82415: stdout chunk (state=3): >>> # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2<<< 24971 1727096414.82440: stdout chunk (state=3): >>> # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath<<< 24971 1727096414.82460: stdout chunk (state=3): >>> # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile<<< 24971 1727096414.82484: stdout chunk (state=3): >>> # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils<<< 24971 1727096414.82509: stdout chunk (state=3): >>> # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] 
removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd <<< 24971 1727096414.82539: stdout chunk (state=3): >>># cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd<<< 24971 1727096414.82570: stdout chunk (state=3): >>> # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader<<< 24971 1727096414.82584: stdout chunk (state=3): >>> # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat<<< 24971 1727096414.82606: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters<<< 24971 1727096414.82671: stdout chunk (state=3): >>> # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy<<< 24971 1727096414.82694: stdout chunk (state=3): >>> # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast <<< 24971 1727096414.82715: stdout chunk (state=3): >>># destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] 
removing ansible.module_utils.common.arg_spec<<< 24971 1727096414.82740: stdout chunk (state=3): >>> # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process<<< 24971 1727096414.82965: stdout chunk (state=3): >>> # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing 
ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansi<<< 24971 1727096414.82977: stdout chunk (state=3): >>>ble.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy 
ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna <<< 24971 1727096414.83411: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 24971 1727096414.83444: stdout chunk (state=3): >>># destroy importlib.machinery <<< 24971 1727096414.83467: stdout chunk (state=3): >>># destroy importlib._abc <<< 24971 1727096414.83473: stdout chunk (state=3): >>># destroy importlib.util<<< 24971 1727096414.83486: stdout chunk (state=3): >>> <<< 24971 1727096414.83511: stdout chunk (state=3): >>># destroy _bz2 <<< 24971 1727096414.83539: stdout chunk 
(state=3): >>># destroy _compression <<< 24971 1727096414.83555: stdout chunk (state=3): >>># destroy _lzma <<< 24971 1727096414.83580: stdout chunk (state=3): >>># destroy _blake2 <<< 24971 1727096414.83611: stdout chunk (state=3): >>># destroy binascii <<< 24971 1727096414.83640: stdout chunk (state=3): >>># destroy zlib <<< 24971 1727096414.83680: stdout chunk (state=3): >>># destroy bz2 # destroy lzma<<< 24971 1727096414.83687: stdout chunk (state=3): >>> # destroy zipfile._path<<< 24971 1727096414.83695: stdout chunk (state=3): >>> # destroy zipfile <<< 24971 1727096414.83709: stdout chunk (state=3): >>># destroy pathlib # destroy zipfile._path.glob # destroy ipaddress<<< 24971 1727096414.83762: stdout chunk (state=3): >>> # destroy ntpath <<< 24971 1727096414.83790: stdout chunk (state=3): >>># destroy importlib <<< 24971 1727096414.83806: stdout chunk (state=3): >>># destroy zipimport<<< 24971 1727096414.83812: stdout chunk (state=3): >>> <<< 24971 1727096414.83840: stdout chunk (state=3): >>># destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder<<< 24971 1727096414.83855: stdout chunk (state=3): >>> # destroy json.encoder # destroy json.scanner<<< 24971 1727096414.83881: stdout chunk (state=3): >>> # destroy _json # destroy grp<<< 24971 1727096414.83920: stdout chunk (state=3): >>> # destroy encodings # destroy _locale # destroy locale <<< 24971 1727096414.83943: stdout chunk (state=3): >>># destroy select # destroy _signal # destroy _posixsubprocess<<< 24971 1727096414.83974: stdout chunk (state=3): >>> # destroy syslog<<< 24971 1727096414.83977: stdout chunk (state=3): >>> # destroy uuid<<< 24971 1727096414.84025: stdout chunk (state=3): >>> # destroy selinux<<< 24971 1727096414.84037: stdout chunk (state=3): >>> <<< 24971 1727096414.84048: stdout chunk (state=3): >>># destroy shutil<<< 24971 1727096414.84054: stdout chunk (state=3): >>> <<< 24971 1727096414.84085: stdout chunk (state=3): >>># destroy distro<<< 24971 1727096414.84096: stdout chunk (state=3): >>> <<< 24971 1727096414.84106: stdout chunk (state=3): >>># destroy distro.distro <<< 24971 1727096414.84161: stdout chunk (state=3): >>># destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors<<< 24971 1727096414.84175: stdout chunk (state=3): >>> <<< 24971 1727096414.84189: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.ansible_collector <<< 24971 1727096414.84210: stdout chunk (state=3): >>># destroy multiprocessing <<< 24971 1727096414.84214: stdout chunk (state=3): >>># destroy multiprocessing.connection # destroy multiprocessing.pool<<< 24971 1727096414.84232: stdout chunk (state=3): >>> # destroy signal # destroy pickle <<< 24971 1727096414.84270: stdout chunk (state=3): >>># destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle<<< 24971 1727096414.84280: stdout chunk (state=3): >>> <<< 24971 1727096414.84284: stdout chunk (state=3): >>># destroy queue<<< 24971 1727096414.84312: stdout chunk (state=3): >>> # destroy _heapq # destroy _queue # destroy multiprocessing.process<<< 24971 1727096414.84326: stdout chunk (state=3): >>> # destroy unicodedata<<< 24971 1727096414.84346: stdout chunk (state=3): >>> # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing<<< 24971 1727096414.84382: stdout chunk (state=3): >>> # destroy shlex<<< 24971 1727096414.84388: stdout chunk (state=3): >>> <<< 24971 
1727096414.84410: stdout chunk (state=3): >>># destroy fcntl # destroy datetime<<< 24971 1727096414.84416: stdout chunk (state=3): >>> <<< 24971 1727096414.84434: stdout chunk (state=3): >>># destroy subprocess # destroy base64<<< 24971 1727096414.84469: stdout chunk (state=3): >>> # destroy _ssl <<< 24971 1727096414.84502: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux<<< 24971 1727096414.84523: stdout chunk (state=3): >>> # destroy getpass<<< 24971 1727096414.84539: stdout chunk (state=3): >>> # destroy pwd # destroy termios<<< 24971 1727096414.84564: stdout chunk (state=3): >>> # destroy errno<<< 24971 1727096414.84570: stdout chunk (state=3): >>> <<< 24971 1727096414.84611: stdout chunk (state=3): >>># destroy json # destroy socket<<< 24971 1727096414.84615: stdout chunk (state=3): >>> <<< 24971 1727096414.84640: stdout chunk (state=3): >>># destroy struct<<< 24971 1727096414.84651: stdout chunk (state=3): >>> # destroy glob <<< 24971 1727096414.84707: stdout chunk (state=3): >>># destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector <<< 24971 1727096414.84753: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux<<< 24971 1727096414.84777: stdout chunk (state=3): >>> # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes<<< 24971 1727096414.84786: stdout chunk (state=3): >>> # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket<<< 24971 1727096414.84809: stdout chunk (state=3): >>> # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal <<< 24971 1727096414.84822: stdout chunk (state=3): >>># cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache<<< 24971 1727096414.84844: stdout chunk (state=3): >>> # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize<<< 24971 1727096414.84865: stdout chunk (state=3): >>> # cleanup[3] wiping platform # cleanup[3] wiping atexit<<< 24971 1727096414.84882: stdout chunk (state=3): >>> # cleanup[3] wiping _typing<<< 24971 1727096414.84897: stdout chunk (state=3): >>> # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref<<< 24971 1727096414.84909: stdout chunk (state=3): >>> # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect<<< 24971 1727096414.84927: stdout chunk (state=3): >>> # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external<<< 24971 1727096414.84950: stdout chunk (state=3): >>> # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re<<< 24971 1727096414.84964: stdout chunk (state=3): >>> # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum<<< 24971 1727096414.84991: stdout chunk (state=3): >>> # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre<<< 24971 1727096414.85007: stdout chunk (state=3): >>> # cleanup[3] wiping functools # cleanup[3] wiping _functools<<< 24971 1727096414.85014: stdout chunk (state=3): >>> # cleanup[3] wiping 
collections<<< 24971 1727096414.85027: stdout chunk (state=3): >>> # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools<<< 24971 1727096414.85050: stdout chunk (state=3): >>> # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig<<< 24971 1727096414.85063: stdout chunk (state=3): >>> # cleanup[3] wiping os # destroy posixpath<<< 24971 1727096414.85089: stdout chunk (state=3): >>> # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io<<< 24971 1727096414.85112: stdout chunk (state=3): >>> # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time<<< 24971 1727096414.85120: stdout chunk (state=3): >>> # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal<<< 24971 1727096414.85134: stdout chunk (state=3): >>> # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp<<< 24971 1727096414.85156: stdout chunk (state=3): >>> # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys<<< 24971 1727096414.85173: stdout chunk (state=3): >>> # cleanup[3] wiping builtins<<< 24971 1727096414.85193: stdout chunk (state=3): >>> # destroy selinux._selinux<<< 24971 1727096414.85199: stdout chunk (state=3): >>> # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader<<< 24971 1727096414.85340: stdout chunk (state=3): >>> # destroy systemd._journal # destroy _datetime <<< 24971 1727096414.85486: stdout chunk (state=3): >>># destroy sys.monitoring <<< 24971 1727096414.85513: stdout chunk (state=3): >>># destroy _socket <<< 24971 1727096414.85584: stdout chunk (state=3): >>># destroy _collections <<< 24971 1727096414.85610: stdout chunk (state=3): >>># destroy platform # destroy _uuid<<< 24971 1727096414.85666: stdout chunk (state=3): >>> # destroy stat # destroy genericpath <<< 24971 1727096414.85690: stdout chunk (state=3): >>># destroy re._parser # destroy tokenize<<< 24971 1727096414.85749: stdout chunk (state=3): >>> # destroy ansible.module_utils.six.moves.urllib <<< 24971 1727096414.85752: stdout chunk (state=3): >>># destroy copyreg<<< 24971 1727096414.85818: stdout chunk (state=3): >>> # destroy contextlib # destroy _typing <<< 24971 1727096414.85821: stdout chunk (state=3): >>># destroy _tokenize <<< 24971 1727096414.85916: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external <<< 24971 1727096414.86022: stdout chunk (state=3): >>># destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib<<< 24971 1727096414.86264: stdout chunk (state=3): >>> # destroy codecs<<< 24971 1727096414.86271: stdout chunk (state=3): >>> # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy 
warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time<<< 24971 1727096414.86300: stdout chunk (state=3): >>> # destroy _random # destroy _weakref <<< 24971 1727096414.86351: stdout chunk (state=3): >>># destroy _hashlib # destroy _operator<<< 24971 1727096414.86393: stdout chunk (state=3): >>> # destroy _sre # destroy _string # destroy re # destroy itertools<<< 24971 1727096414.86421: stdout chunk (state=3): >>> # destroy _abc # destroy posix<<< 24971 1727096414.86442: stdout chunk (state=3): >>> # destroy _functools # destroy builtins # destroy _thread<<< 24971 1727096414.86486: stdout chunk (state=3): >>> # clear sys.audit hooks<<< 24971 1727096414.86543: stdout chunk (state=3): >>> <<< 24971 1727096414.86999: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. <<< 24971 1727096414.87092: stderr chunk (state=3): >>><<< 24971 1727096414.87095: stdout chunk (state=3): >>><<< 24971 1727096414.87300: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4141a184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41419e7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4141a1aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f414182d130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f414182dfa0> import 'site' # Python 3.12.5 (main, 
Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f414186be90> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f414186bf50> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41418a3830> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41418a3ec0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4141883b60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4141881280> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4141869040> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41418c37d0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41418c23f0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f4141882150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41418c0c20> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41418f8860> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41418682c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f41418f8d10> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41418f8bc0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f41418f8f80> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4141866de0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41418f9610> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41418f92e0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41418fa510> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4141910710> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4141911df0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code 
object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4141912c90> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f41419132f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41419121e0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4141913d70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41419134a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41418fa540> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f414162fbf0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f41416586e0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4141658440> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4141658710> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4141659040> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from 
'/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f41416599a0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41416588f0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f414162dd90> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f414165adb0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4141659af0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41418fac30> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4141687110> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41416a74a0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4141708260> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f414170a9c0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4141708380> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41416d1280> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4141519340> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41416a62a0> import 
'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f414165bce0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f41415195b0> # zipimport: found 103 names in '/tmp/ansible_setup_payload_bs65rtvw/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41415830b0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4141561fa0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4141561160> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4141580f80> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f41415b29c0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41415b2750> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41415b2060> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41415b2ab0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4141583ad0> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 
0x7f41415b3740> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f41415b3980> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41415b3ec0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4140f2db50> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4140f2f7d0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4140f301a0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4140f31340> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4140f33d70> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4140f30110> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4140f32060> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4140f3baa0> import '_tokenize' # import 'tokenize' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f4140f3a570> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4140f3a2d0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4140f3a840> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4140f32540> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4140f7fd70> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4140f7fda0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4140f81910> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4140f816d0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4140f83e60> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4140f82000> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4140f87620> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4140f83fb0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f4140f883e0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4140f885c0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4140f88920> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4140f802c0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4140f8bf80> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4140e14f50> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4140f8a750> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4140f8baa0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4140f8a390> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from 
'/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4140e19160> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4140e19e80> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4140f8b5f0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4140e19ee0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4140e1b0b0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4140e25d30> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4140e20a40> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4140f0e750> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41415de420> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4140e25f10> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4140e25bb0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4140eb5af0> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4140a63e00> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from 
'/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4140a683e0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4140e9c8c0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4140eb6690> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4140eb41d0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4140eb7c80> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4140a6b0b0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4140a6a960> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4140a6ab10> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4140a69d90> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4140a6b1a0> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4140ac9ca0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4140a6bc80> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4140eb7e60> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: 
zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4140acbe30> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4140aca750> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4140b05fa0> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4140acb560> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # 
/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4140b19d60> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4140b1b740> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4140917440> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4140916510> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4140914da0> {"ansible_facts": {"ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, 
"final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-14-152.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-14-152", "ansible_nodename": "ip-10-31-14-152.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec24182c1ede649ec25b32fe78ed72bb", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC7DqKfC0Ye8uNTSswfdzhsJfWNkZqr/RUstyC1z93+5saW22gxwNSvixV/q/ri8GxqpAsSE+oK2wsR0V0GdogZH271MQEZ63vu1adXuVvwMLN81oGLTzCz50BgmDXlvpEVTodRA7qlbviA7mmwUA/1WP1v1UqCiHmBjaiQT14v9PJhSSiWk1S/2gPlOmfl801arqI6YnvGYHiIop+XESUB4W8ewtJtTJhFqzucGnDCUYA1VAqcSIdjoQaZhZeDXc1gRTzx7QIe3EGJnCnomIPoXNvQk2UgnH08TJGZFoICgqdfZUi+g2uK+PweRJ1TUsmlf86iGOSFYgnCserDeAdpOIuLR5oBE8UNKjxPBW6hJ3It+vPT8mrYLOXrlbt7it9HhxfgYWcqc6ebJyBYNMo0bPdddEpIPiWDXZOoQmZGaGOpSa1K+8hxIvZ9t+jl8b8b8sODYeoVjVaeVXt0i5hoW0CoLWO77c+cGgwJ5+kzXD7eo/SVn+kYjDfKFBCtkJU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJ80c6wNB873zUjbHI8VtU557DuKd4APS4WjMTqAOLKKumkxtoQY9gCkk5ZG2HLqdKHBgsFER7nThIQ+11R1mBw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINeHLdc2PHGxhVM3VxIFOOPlFPTEnJXEcPAkPavmyu6v", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_apparmor": {"status": "disabled"}, "ansible_fips": false, "ansible_local": {}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": "39", "day": "23", "hour": "09", "minute": "00", "second": "14", "epoch": "1727096414", "epoch_int": "1727096414", "date": "2024-09-23", "time": "09:00:14", "iso8601_micro": "2024-09-23T13:00:14.801774Z", "iso8601": "2024-09-23T13:00:14Z", "iso8601_basic": "20240923T090014801774", "iso8601_basic_short": "20240923T090014", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_lsb": {}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.159 44844 10.31.14.152 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.159 44844 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": 
"unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] 
removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy 
ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing 
ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing 
ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # 
destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping 
encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
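The import and teardown chatter above is produced by the remote setup module running under PYTHONVERBOSE=1 (the variable is visible in ansible_env inside the facts payload), so the interpreter echoes every import and every cleanup step around the module's JSON result. The warning that follows is Ansible noticing that extra text; the facts are still usable because a JSON object can be decoded from the front of the mixed output and the trailing noise ignored. A minimal, hypothetical sketch of that idea (not Ansible's actual result parser) using json.JSONDecoder.raw_decode:

    import json

    def extract_json_payload(stdout: str) -> dict:
        """Decode the first JSON object found in mixed module output,
        ignoring any verbose-interpreter text that follows it."""
        start = stdout.index("{")  # first candidate object; a minimal heuristic
        obj, _end = json.JSONDecoder().raw_decode(stdout, start)
        return obj

    # Example shaped like the log above: a JSON result followed by junk.
    sample = '{"ansible_facts": {"ansible_system": "Linux"}} # clear sys.path_hooks ...'
    print(extract_json_payload(sample)["ansible_facts"]["ansible_system"])  # Linux

raw_decode also returns the offset where the object ends, which is how the trailing "junk after the JSON data" can be separated from the payload in a sketch like this.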
[WARNING]: Module invocation had junk after the JSON data: # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] 
removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy 
ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] 
removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy 
ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # 
destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # 
cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks 24971 1727096414.88470: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096414.22689-25051-247577177057774/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24971 1727096414.88474: _low_level_execute_command(): starting 24971 1727096414.88476: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096414.22689-25051-247577177057774/ > /dev/null 2>&1 && sleep 0' 24971 1727096414.88594: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096414.88597: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096414.88600: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 
24971 1727096414.88602: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096414.88660: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096414.88662: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096414.88698: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 24971 1727096414.91174: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096414.91179: stdout chunk (state=3): >>><<< 24971 1727096414.91181: stderr chunk (state=3): >>><<< 24971 1727096414.91216: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 24971 1727096414.91279: handler run complete 24971 1727096414.91389: variable 'ansible_facts' from source: unknown 24971 1727096414.91457: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096414.91976: variable 'ansible_facts' from source: unknown 24971 1727096414.92097: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096414.92149: attempt loop complete, returning result 24971 1727096414.92152: _execute() done 24971 1727096414.92155: dumping result to json 24971 1727096414.92166: done dumping result, returning 24971 1727096414.92384: done running TaskExecutor() for managed_node3/TASK: Gather the minimum subset of ansible_facts required by the network role test [0afff68d-5257-3482-6844-0000000000ca] 24971 1727096414.92388: sending task result for task 0afff68d-5257-3482-6844-0000000000ca 24971 1727096414.92586: done sending task result for task 0afff68d-5257-3482-6844-0000000000ca 24971 1727096414.92589: WORKER PROCESS EXITING ok: [managed_node3] 24971 1727096414.92707: no more pending results, returning what we have 24971 1727096414.92711: results queue empty 24971 1727096414.92711: checking for any_errors_fatal 24971 1727096414.92713: done checking for any_errors_fatal 24971 1727096414.92714: checking for max_fail_percentage 24971 
1727096414.92715: done checking for max_fail_percentage 24971 1727096414.92716: checking to see if all hosts have failed and the running result is not ok 24971 1727096414.92717: done checking to see if all hosts have failed 24971 1727096414.92718: getting the remaining hosts for this loop 24971 1727096414.92719: done getting the remaining hosts for this loop 24971 1727096414.92723: getting the next task for host managed_node3 24971 1727096414.92731: done getting next task for host managed_node3 24971 1727096414.92733: ^ task is: TASK: Check if system is ostree 24971 1727096414.92736: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24971 1727096414.92740: getting variables 24971 1727096414.92742: in VariableManager get_vars() 24971 1727096414.92874: Calling all_inventory to load vars for managed_node3 24971 1727096414.92877: Calling groups_inventory to load vars for managed_node3 24971 1727096414.92883: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096414.92895: Calling all_plugins_play to load vars for managed_node3 24971 1727096414.92898: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096414.92901: Calling groups_plugins_play to load vars for managed_node3 24971 1727096414.93058: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096414.93649: done with get_vars() 24971 1727096414.93660: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Monday 23 September 2024 09:00:14 -0400 (0:00:00.800) 0:00:02.417 ****** 24971 1727096414.93955: entering _queue_task() for managed_node3/stat 24971 1727096414.94608: worker is 1 (out of 1 available) 24971 1727096414.94618: exiting _queue_task() for managed_node3/stat 24971 1727096414.94627: done queuing things up, now waiting for results queue to drain 24971 1727096414.94628: waiting for pending results... 
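(Editor's note: the "Check if system is ostree" task queued above uses the stat module, and the conditional evaluated below, `not __network_is_ostree is defined`, makes it run only when that fact has not already been set. The marker path is not shown in this log; the conventional check on OSTree-based systems is the presence of /run/ostree-booted, so the task reduces to something like this sketch, with the path assumed.)

```python
import os

# Assumed marker: OSTree-based hosts (Fedora/RHEL Atomic, CoreOS) expose
# /run/ostree-booted at runtime. The actual task uses ansible.builtin.stat on
# the remote host; this is only the equivalent local check.
OSTREE_MARKER = "/run/ostree-booted"

def is_ostree_system(marker: str = OSTREE_MARKER) -> bool:
    """Return True if the marker file exists, i.e. the host booted via OSTree."""
    return os.path.exists(marker)

if __name__ == "__main__":
    # The test setup would record this result as the __network_is_ostree fact.
    print(is_ostree_system())
```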
24971 1727096414.95299: running TaskExecutor() for managed_node3/TASK: Check if system is ostree 24971 1727096414.95647: in run() - task 0afff68d-5257-3482-6844-0000000000cc 24971 1727096414.95651: variable 'ansible_search_path' from source: unknown 24971 1727096414.95976: variable 'ansible_search_path' from source: unknown 24971 1727096414.96386: calling self._execute() 24971 1727096414.96389: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096414.96392: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096414.96395: variable 'omit' from source: magic vars 24971 1727096414.97722: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24971 1727096414.98311: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24971 1727096414.98482: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24971 1727096414.98520: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24971 1727096414.98582: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24971 1727096414.98816: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 24971 1727096414.98846: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 24971 1727096414.98975: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 24971 1727096414.99016: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 24971 1727096414.99173: Evaluated conditional (not __network_is_ostree is defined): True 24971 1727096414.99327: variable 'omit' from source: magic vars 24971 1727096414.99375: variable 'omit' from source: magic vars 24971 1727096414.99420: variable 'omit' from source: magic vars 24971 1727096414.99654: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24971 1727096414.99657: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24971 1727096414.99660: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24971 1727096414.99662: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096414.99664: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096414.99764: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24971 1727096414.99782: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096414.99969: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096415.00029: Set connection var ansible_shell_type to sh 24971 1727096415.00042: Set connection var ansible_shell_executable to /bin/sh 24971 1727096415.00087: Set 
connection var ansible_timeout to 10 24971 1727096415.00097: Set connection var ansible_connection to ssh 24971 1727096415.00177: Set connection var ansible_pipelining to False 24971 1727096415.00193: Set connection var ansible_module_compression to ZIP_DEFLATED 24971 1727096415.00230: variable 'ansible_shell_executable' from source: unknown 24971 1727096415.00238: variable 'ansible_connection' from source: unknown 24971 1727096415.00249: variable 'ansible_module_compression' from source: unknown 24971 1727096415.00301: variable 'ansible_shell_type' from source: unknown 24971 1727096415.00304: variable 'ansible_shell_executable' from source: unknown 24971 1727096415.00306: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096415.00308: variable 'ansible_pipelining' from source: unknown 24971 1727096415.00410: variable 'ansible_timeout' from source: unknown 24971 1727096415.00414: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096415.00628: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 24971 1727096415.00632: variable 'omit' from source: magic vars 24971 1727096415.00746: starting attempt loop 24971 1727096415.00753: running the handler 24971 1727096415.00773: _low_level_execute_command(): starting 24971 1727096415.00787: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24971 1727096415.02729: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096415.03041: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096415.03173: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096415.03200: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096415.03330: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096415.05158: stdout chunk (state=3): >>>/root <<< 24971 1727096415.05162: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096415.05165: stdout chunk (state=3): >>><<< 24971 1727096415.05168: stderr chunk (state=3): >>><<< 24971 1727096415.05172: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096415.05182: _low_level_execute_command(): starting 24971 1727096415.05184: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096415.0510218-25091-148353253629804 `" && echo ansible-tmp-1727096415.0510218-25091-148353253629804="` echo /root/.ansible/tmp/ansible-tmp-1727096415.0510218-25091-148353253629804 `" ) && sleep 0' 24971 1727096415.06432: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24971 1727096415.06677: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096415.06682: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096415.06739: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096415.06748: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096415.06811: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096415.08880: stdout chunk (state=3): >>>ansible-tmp-1727096415.0510218-25091-148353253629804=/root/.ansible/tmp/ansible-tmp-1727096415.0510218-25091-148353253629804 <<< 24971 1727096415.09062: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096415.09065: stdout chunk (state=3): >>><<< 24971 1727096415.09069: stderr chunk (state=3): >>><<< 24971 1727096415.09072: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096415.0510218-25091-148353253629804=/root/.ansible/tmp/ansible-tmp-1727096415.0510218-25091-148353253629804 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096415.09074: variable 'ansible_module_compression' from source: unknown 24971 1727096415.09228: ANSIBALLZ: Using lock for stat 24971 1727096415.09329: ANSIBALLZ: Acquiring lock 24971 1727096415.09332: ANSIBALLZ: Lock acquired: 139839577444896 24971 1727096415.09335: ANSIBALLZ: Creating module 24971 1727096415.34611: ANSIBALLZ: Writing module into payload 24971 1727096415.34740: ANSIBALLZ: Writing module 24971 1727096415.34772: ANSIBALLZ: Renaming module 24971 1727096415.34785: ANSIBALLZ: Done creating module 24971 1727096415.34805: variable 'ansible_facts' from source: unknown 24971 1727096415.34897: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096415.0510218-25091-148353253629804/AnsiballZ_stat.py 24971 1727096415.35096: Sending initial data 24971 1727096415.35099: Sent initial data (153 bytes) 24971 1727096415.36186: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096415.36301: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096415.36305: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096415.36435: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096415.36508: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096415.38669: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" 
revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24971 1727096415.38722: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 24971 1727096415.38780: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24971ctsu8jan/tmp46bsdn7v /root/.ansible/tmp/ansible-tmp-1727096415.0510218-25091-148353253629804/AnsiballZ_stat.py <<< 24971 1727096415.38794: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096415.0510218-25091-148353253629804/AnsiballZ_stat.py" <<< 24971 1727096415.38827: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24971ctsu8jan/tmp46bsdn7v" to remote "/root/.ansible/tmp/ansible-tmp-1727096415.0510218-25091-148353253629804/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096415.0510218-25091-148353253629804/AnsiballZ_stat.py" <<< 24971 1727096415.39795: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096415.39837: stderr chunk (state=3): >>><<< 24971 1727096415.39840: stdout chunk (state=3): >>><<< 24971 1727096415.40036: done transferring module to remote 24971 1727096415.40039: _low_level_execute_command(): starting 24971 1727096415.40041: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096415.0510218-25091-148353253629804/ /root/.ansible/tmp/ansible-tmp-1727096415.0510218-25091-148353253629804/AnsiballZ_stat.py && sleep 0' 24971 1727096415.40990: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24971 1727096415.41011: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096415.41036: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096415.41140: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096415.41183: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096415.41212: stderr chunk (state=3): >>>debug1: mux_client_request_session: master 
session id: 4 <<< 24971 1727096415.43718: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096415.43730: stdout chunk (state=3): >>><<< 24971 1727096415.43757: stderr chunk (state=3): >>><<< 24971 1727096415.43795: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 24971 1727096415.43862: _low_level_execute_command(): starting 24971 1727096415.43875: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096415.0510218-25091-148353253629804/AnsiballZ_stat.py && sleep 0' 24971 1727096415.45011: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24971 1727096415.45032: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096415.45053: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096415.45078: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24971 1727096415.45245: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096415.45387: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096415.45458: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 24971 1727096415.48783: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook 
<<< 24971 1727096415.48854: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py <<< 24971 1727096415.48888: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 24971 1727096415.48913: stdout chunk (state=3): >>>import '_codecs' # <<< 24971 1727096415.48937: stdout chunk (state=3): >>>import 'codecs' # <<< 24971 1727096415.48999: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 24971 1727096415.49030: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<< 24971 1727096415.49149: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d58e84d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d58b7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d58eaa50> import '_signal' # import '_abc' # import 'abc' # <<< 24971 1727096415.49182: stdout chunk (state=3): >>>import 'io' # <<< 24971 1727096415.49252: stdout chunk (state=3): >>>import '_stat' # <<< 24971 1727096415.49280: stdout chunk (state=3): >>>import 'stat' # <<< 24971 1727096415.49493: stdout chunk (state=3): >>>import '_collections_abc' # <<< 24971 1727096415.49523: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages <<< 24971 1727096415.49543: stdout chunk (state=3): >>>Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' <<< 24971 1727096415.49803: stdout chunk (state=3): >>>Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d5699130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d5699fa0> import 'site' # <<< 24971 1727096415.49807: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 24971 1727096415.50181: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 24971 1727096415.50198: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 24971 1727096415.50223: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 24971 1727096415.50240: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 24971 1727096415.50279: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 24971 1727096415.50342: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 24971 1727096415.50362: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 24971 1727096415.50542: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d56d7e60> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d56d7f20> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 24971 1727096415.50614: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 24971 1727096415.50640: stdout chunk (state=3): >>>import 'itertools' # <<< 24971 1727096415.50669: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py <<< 24971 1727096415.50691: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d570f890> <<< 24971 1727096415.50715: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<< 24971 1727096415.50741: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' <<< 24971 1727096415.50761: stdout chunk (state=3): >>>import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d570ff20> <<< 24971 1727096415.50785: stdout chunk (state=3): >>>import '_collections' # <<< 24971 1727096415.50841: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d56efb30> <<< 24971 1727096415.50864: stdout chunk (state=3): >>>import '_functools' # <<< 24971 1727096415.50903: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d56ed250> <<< 24971 1727096415.51040: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d56d5010> <<< 24971 1727096415.51079: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 24971 1727096415.51107: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 24971 1727096415.51125: stdout chunk (state=3): >>>import '_sre' # <<< 24971 1727096415.51160: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 24971 1727096415.51194: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 24971 1727096415.51224: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py <<< 24971 1727096415.51346: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d572f800> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d572e450> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d56ee120> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d572ccb0> <<< 24971 1727096415.51405: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 24971 1727096415.51429: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' <<< 24971 1727096415.51432: stdout chunk (state=3): >>>import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d5764860> <<< 24971 1727096415.51454: stdout chunk (state=3): >>>import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d56d4290> <<< 24971 1727096415.51480: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py <<< 24971 1727096415.51500: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 24971 1727096415.51526: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' <<< 24971 1727096415.51540: stdout chunk (state=3): >>># extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f80d5764d10> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d5764bc0> <<< 24971 1727096415.51593: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 24971 1727096415.51613: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f80d5764fb0> <<< 24971 1727096415.51634: stdout chunk (state=3): >>>import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d56d2db0> <<< 24971 1727096415.51848: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d57656a0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d5765370> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d57665a0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 24971 1727096415.51918: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 24971 1727096415.51922: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py <<< 24971 1727096415.51956: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d577c7a0> <<< 24971 1727096415.51959: stdout chunk (state=3): >>>import 'errno' # <<< 24971 1727096415.52006: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<< 24971 1727096415.52009: stdout chunk (state=3): >>># extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f80d577de80> <<< 24971 1727096415.52040: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py <<< 24971 1727096415.52059: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 24971 1727096415.52100: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py <<< 24971 1727096415.52129: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 24971 1727096415.52141: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d577ed20> <<< 24971 1727096415.52175: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' <<< 24971 1727096415.52216: stdout chunk (state=3): >>># extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f80d577f320> <<< 24971 1727096415.52219: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d577e270> <<< 24971 1727096415.52258: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py <<< 24971 1727096415.52261: stdout chunk 
(state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 24971 1727096415.52312: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 24971 1727096415.52348: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f80d577fda0> <<< 24971 1727096415.52351: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d577f4d0> <<< 24971 1727096415.52400: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d5766510> <<< 24971 1727096415.52439: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 24971 1727096415.52475: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 24971 1727096415.52589: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' <<< 24971 1727096415.52678: stdout chunk (state=3): >>># extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f80d550fbf0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 24971 1727096415.52713: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' <<< 24971 1727096415.52973: stdout chunk (state=3): >>># extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f80d55386b0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d5538410> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f80d55386e0> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 24971 1727096415.53016: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f80d5539010> <<< 24971 1727096415.53198: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' <<< 24971 1727096415.53202: stdout chunk (state=3): >>># extension module 
'_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f80d55399d0> <<< 24971 1727096415.53248: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d55388c0> <<< 24971 1727096415.53251: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d550dd90> <<< 24971 1727096415.53275: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 24971 1727096415.53303: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 24971 1727096415.53376: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 24971 1727096415.53442: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d553ad20> <<< 24971 1727096415.53470: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d5538e60> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d5766750> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 24971 1727096415.53681: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 24971 1727096415.53684: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 24971 1727096415.53686: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d5567080> <<< 24971 1727096415.53747: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 24971 1727096415.53807: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 24971 1727096415.53904: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 24971 1727096415.54125: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d5587440> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # <<< 24971 1727096415.54158: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d55e8260> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 24971 1727096415.54171: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 24971 1727096415.54195: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 24971 1727096415.54264: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 24971 1727096415.54388: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d55ea9c0> <<< 24971 1727096415.54700: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d55e8380> <<< 24971 1727096415.54714: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d55b5250> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d4f25340> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d5586240> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d553bc50> <<< 24971 1727096415.54816: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 24971 1727096415.54850: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f80d5586840> <<< 24971 1727096415.55060: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_1rjdnsuy/ansible_stat_payload.zip' <<< 24971 1727096415.55077: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096415.55448: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 24971 1727096415.55481: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 24971 1727096415.55508: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py <<< 24971 1727096415.55528: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d4f7afc0> <<< 24971 1727096415.55548: stdout chunk (state=3): >>>import '_typing' # <<< 24971 1727096415.55818: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d4f59eb0> <<< 24971 1727096415.55837: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d4f59070> <<< 24971 1727096415.55857: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096415.55883: stdout chunk (state=3): >>>import 'ansible' # <<< 24971 1727096415.55923: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096415.55944: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096415.55964: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096415.55981: stdout chunk (state=3): >>>import 
'ansible.module_utils' # <<< 24971 1727096415.56007: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096415.58158: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096415.59988: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py <<< 24971 1727096415.60007: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' <<< 24971 1727096415.60013: stdout chunk (state=3): >>>import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d4f78e90> <<< 24971 1727096415.60037: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py <<< 24971 1727096415.60249: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f80d4fa6900> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d4fa6690> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d4fa5fa0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py <<< 24971 1727096415.60266: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 24971 1727096415.60314: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d4fa63f0> <<< 24971 1727096415.60326: stdout chunk (state=3): >>>import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d4f7bc50> <<< 24971 1727096415.60448: stdout chunk (state=3): >>>import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f80d4fa76b0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f80d4fa78f0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 24971 1727096415.60484: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 24971 1727096415.60508: stdout chunk (state=3): >>>import '_locale' # <<< 24971 1727096415.60573: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d4fa7e30> <<< 24971 
1727096415.60594: stdout chunk (state=3): >>>import 'pwd' # <<< 24971 1727096415.60619: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 24971 1727096415.60663: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 24971 1727096415.60711: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d4e0dbe0> <<< 24971 1727096415.60752: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' <<< 24971 1727096415.60767: stdout chunk (state=3): >>># extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f80d4e0f800> <<< 24971 1727096415.60793: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 24971 1727096415.60817: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 24971 1727096415.60877: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d4e101a0> <<< 24971 1727096415.60898: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 24971 1727096415.60942: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 24971 1727096415.60970: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d4e11340> <<< 24971 1727096415.61025: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 24971 1727096415.61054: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 24971 1727096415.61107: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py <<< 24971 1727096415.61119: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 24971 1727096415.61375: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d4e13da0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f80d55e81d0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d4e12090> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 24971 1727096415.61424: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 24971 1727096415.61428: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py <<< 24971 1727096415.61430: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 24971 1727096415.61460: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d4e1bbc0> import '_tokenize' # <<< 24971 1727096415.61565: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d4e1a690> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d4e1a420> <<< 24971 1727096415.61641: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 24971 1727096415.61857: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d4e1a960> <<< 24971 1727096415.61893: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d4e12570> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f80d4e63e60> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d4e63ec0> <<< 24971 1727096415.61910: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 24971 1727096415.61943: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 24971 1727096415.62002: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 24971 1727096415.62025: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' <<< 24971 1727096415.62275: stdout chunk (state=3): >>># extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f80d4e659a0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d4e65760> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 24971 1727096415.62283: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' <<< 24971 1727096415.62454: stdout chunk (state=3): >>># extension module '_uuid' executed from 
'/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f80d4e67ec0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d4e66060> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 24971 1727096415.62483: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d4e6b6b0> <<< 24971 1727096415.62675: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d4e67f80> <<< 24971 1727096415.62759: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' <<< 24971 1727096415.62777: stdout chunk (state=3): >>># extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' <<< 24971 1727096415.62779: stdout chunk (state=3): >>>import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f80d4e6c4a0> <<< 24971 1727096415.62815: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' <<< 24971 1727096415.62826: stdout chunk (state=3): >>># extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' <<< 24971 1727096415.62833: stdout chunk (state=3): >>>import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f80d4e6c7d0> <<< 24971 1727096415.62896: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' <<< 24971 1727096415.62902: stdout chunk (state=3): >>># extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f80d4e6ca10> <<< 24971 1727096415.62932: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d4e640e0> <<< 24971 1727096415.62957: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py <<< 24971 1727096415.62977: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 24971 1727096415.63000: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 24971 1727096415.63040: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 24971 1727096415.63076: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 24971 1727096415.63115: stdout chunk (state=3): >>># extension module '_socket' 
executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f80d4ef8140> <<< 24971 1727096415.63355: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 24971 1727096415.63377: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 24971 1727096415.63382: stdout chunk (state=3): >>>import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f80d4ef94c0> <<< 24971 1727096415.63398: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d4e6e8d0> <<< 24971 1727096415.63440: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' <<< 24971 1727096415.63443: stdout chunk (state=3): >>># extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' <<< 24971 1727096415.63463: stdout chunk (state=3): >>>import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f80d4e6fc80> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d4e6e510> <<< 24971 1727096415.63487: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096415.63505: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096415.63524: stdout chunk (state=3): >>>import 'ansible.module_utils.compat' # <<< 24971 1727096415.63550: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096415.63682: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096415.63800: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096415.63825: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096415.63836: stdout chunk (state=3): >>>import 'ansible.module_utils.common' # <<< 24971 1727096415.63871: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096415.63881: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096415.63898: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text' # <<< 24971 1727096415.63914: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096415.64099: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096415.64282: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096415.65178: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096415.66050: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 24971 1727096415.66078: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # <<< 24971 1727096415.66084: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # <<< 24971 1727096415.66101: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.converters' # <<< 24971 1727096415.66131: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 24971 1727096415.66161: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 24971 1727096415.66340: stdout chunk (state=3): >>># extension module '_ctypes' loaded from 
'/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f80d4efd6a0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 24971 1727096415.66373: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d4efe480> <<< 24971 1727096415.66385: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d4ef9760> <<< 24971 1727096415.66451: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 24971 1727096415.66474: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096415.66506: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096415.66527: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # <<< 24971 1727096415.66554: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096415.66784: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096415.67037: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 24971 1727096415.67055: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 24971 1727096415.67071: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d4efe570> <<< 24971 1727096415.67139: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096415.67841: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096415.68551: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096415.68654: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096415.68756: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 24971 1727096415.68778: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096415.68830: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096415.68918: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # # zipimport: zlib available <<< 24971 1727096415.68993: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096415.69072: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 24971 1727096415.69077: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096415.69113: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing' # <<< 24971 1727096415.69117: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096415.69166: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096415.69195: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 24971 1727096415.69206: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096415.69427: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096415.69670: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 24971 1727096415.69956: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d4eff680> # zipimport: zlib available # zipimport: zlib available <<< 24971 1727096415.70048: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # <<< 24971 1727096415.70052: stdout chunk (state=3): >>>import 'ansible.module_utils.common.validation' # <<< 24971 1727096415.70075: stdout chunk (state=3): >>>import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # <<< 24971 1727096415.70083: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096415.70138: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096415.70191: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 24971 1727096415.70197: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096415.70252: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096415.70341: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096415.70452: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096415.70480: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 24971 1727096415.70539: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 24971 1727096415.70662: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f80d4d09fa0> <<< 24971 1727096415.70706: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d4d05970> <<< 24971 1727096415.70753: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 24971 1727096415.70927: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 24971 1727096415.70958: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096415.71014: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 24971 1727096415.71053: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 24971 1727096415.71063: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 24971 1727096415.71093: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 24971 1727096415.71172: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 24971 1727096415.71196: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches 
/usr/lib64/python3.12/gettext.py <<< 24971 1727096415.71227: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 24971 1727096415.71342: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d4ffea20> <<< 24971 1727096415.71410: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d4fee6f0> <<< 24971 1727096415.71491: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d4d0a150> <<< 24971 1727096415.71495: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d4efefc0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available <<< 24971 1727096415.71533: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096415.71645: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # <<< 24971 1727096415.71653: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096415.71674: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096415.71677: stdout chunk (state=3): >>>import 'ansible.modules' # <<< 24971 1727096415.71689: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096415.71894: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096415.72190: stdout chunk (state=3): >>># zipimport: zlib available <<< 24971 1727096415.72335: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 24971 1727096415.72445: stdout chunk (state=3): >>># destroy __main__ <<< 24971 1727096415.72894: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 24971 1727096415.72898: stdout chunk (state=3): >>># clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack <<< 24971 1727096415.72925: stdout chunk 
(state=3): >>># cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections <<< 24971 1727096415.73196: stdout chunk (state=3): >>># cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing 
string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 24971 1727096415.73418: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # 
destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid <<< 24971 1727096415.73439: stdout chunk (state=3): >>># destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil <<< 24971 1727096415.73453: stdout chunk (state=3): >>># destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess <<< 24971 1727096415.73496: stdout chunk (state=3): >>># cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian <<< 24971 1727096415.73539: stdout chunk (state=3): >>># cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform <<< 24971 1727096415.73562: stdout chunk (state=3): >>># cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg <<< 24971 1727096415.73595: stdout chunk (state=3): >>># cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections <<< 24971 1727096415.73680: stdout chunk (state=3): >>># cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp 
# cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 24971 1727096415.73903: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib<<< 24971 1727096415.73906: stdout chunk (state=3): >>> <<< 24971 1727096415.73980: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases <<< 24971 1727096415.74193: stdout chunk (state=3): >>># destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 24971 1727096415.74515: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
<<< 24971 1727096415.74519: stdout chunk (state=3): >>><<< 24971 1727096415.74521: stderr chunk (state=3): >>><<< 24971 1727096415.74745: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d58e84d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d58b7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d58eaa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d5699130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d5699fa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d56d7e60> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d56d7f20> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d570f890> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d570ff20> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d56efb30> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d56ed250> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d56d5010> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d572f800> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d572e450> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d56ee120> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d572ccb0> # 
/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d5764860> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d56d4290> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f80d5764d10> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d5764bc0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f80d5764fb0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d56d2db0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d57656a0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d5765370> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d57665a0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d577c7a0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f80d577de80> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f80d577ed20> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f80d577f320> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d577e270> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f80d577fda0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d577f4d0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d5766510> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f80d550fbf0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f80d55386b0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d5538410> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f80d55386e0> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f80d5539010> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f80d55399d0> import 'hashlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f80d55388c0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d550dd90> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d553ad20> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d5538e60> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d5766750> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d5567080> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d5587440> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d55e8260> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d55ea9c0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d55e8380> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d55b5250> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d4f25340> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d5586240> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d553bc50> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # 
<_frozen_importlib_external.SourcelessFileLoader object at 0x7f80d5586840> # zipimport: found 30 names in '/tmp/ansible_stat_payload_1rjdnsuy/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d4f7afc0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d4f59eb0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d4f59070> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d4f78e90> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f80d4fa6900> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d4fa6690> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d4fa5fa0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d4fa63f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d4f7bc50> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f80d4fa76b0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from 
'/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f80d4fa78f0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d4fa7e30> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d4e0dbe0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f80d4e0f800> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d4e101a0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d4e11340> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d4e13da0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f80d55e81d0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d4e12090> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d4e1bbc0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d4e1a690> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d4e1a420> # 
/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d4e1a960> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d4e12570> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f80d4e63e60> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d4e63ec0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f80d4e659a0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d4e65760> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f80d4e67ec0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d4e66060> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d4e6b6b0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d4e67f80> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f80d4e6c4a0> # extension module 'systemd._reader' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f80d4e6c7d0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f80d4e6ca10> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d4e640e0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f80d4ef8140> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f80d4ef94c0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d4e6e8d0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f80d4e6fc80> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d4e6e510> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f80d4efd6a0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d4efe480> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d4ef9760> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d4efe570> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d4eff680> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f80d4d09fa0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d4d05970> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches 
/usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d4ffea20> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d4fee6f0> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d4d0a150> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f80d4efefc0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing 
re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy 
ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # 
destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # 
destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
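For context, the module invocation shown above is a stat call against /run/ostree-booted, and the surrounding log identifies the task as "Check if system is ostree". A minimal sketch of a task that would produce this invocation is given below; it is a reconstruction, not the literal content of el_repo_setup.yml, and the register name __ostree_booted_stat is inferred from the variable that appears later in this log:

  - name: Check if system is ostree
    ansible.builtin.stat:
      path: /run/ostree-booted          # path taken from the module_args shown above
    register: __ostree_booted_stat      # assumed register name, inferred from the later set_fact evaluation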
[WARNING]: Module invocation had junk after the JSON data: [interpreter shutdown trace omitted; it repeats, verbatim, the '# destroy ... # cleanup[2] ... # cleanup[3] ... # clear sys.audit hooks' output already shown above for this module run]
24971 1727096415.75918: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096415.0510218-25091-148353253629804/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24971 1727096415.75922: _low_level_execute_command(): starting 24971 1727096415.75925: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r 
/root/.ansible/tmp/ansible-tmp-1727096415.0510218-25091-148353253629804/ > /dev/null 2>&1 && sleep 0' 24971 1727096415.76238: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096415.76241: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 24971 1727096415.76245: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24971 1727096415.76247: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096415.76565: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096415.76577: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096415.78706: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096415.78737: stderr chunk (state=3): >>><<< 24971 1727096415.78740: stdout chunk (state=3): >>><<< 24971 1727096415.78751: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096415.78758: handler run complete 24971 1727096415.78779: attempt loop complete, returning result 24971 1727096415.78782: _execute() done 24971 1727096415.78785: dumping result to json 24971 1727096415.78787: done dumping result, returning 24971 1727096415.78794: done running TaskExecutor() for managed_node3/TASK: Check if system is ostree [0afff68d-5257-3482-6844-0000000000cc] 24971 1727096415.78798: sending task result for task 0afff68d-5257-3482-6844-0000000000cc 24971 1727096415.78881: done sending task result for 
task 0afff68d-5257-3482-6844-0000000000cc 24971 1727096415.78884: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "exists": false } } 24971 1727096415.78942: no more pending results, returning what we have 24971 1727096415.78945: results queue empty 24971 1727096415.78946: checking for any_errors_fatal 24971 1727096415.78952: done checking for any_errors_fatal 24971 1727096415.78952: checking for max_fail_percentage 24971 1727096415.78954: done checking for max_fail_percentage 24971 1727096415.78954: checking to see if all hosts have failed and the running result is not ok 24971 1727096415.78955: done checking to see if all hosts have failed 24971 1727096415.78956: getting the remaining hosts for this loop 24971 1727096415.78957: done getting the remaining hosts for this loop 24971 1727096415.78961: getting the next task for host managed_node3 24971 1727096415.78966: done getting next task for host managed_node3 24971 1727096415.78970: ^ task is: TASK: Set flag to indicate system is ostree 24971 1727096415.78973: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24971 1727096415.78976: getting variables 24971 1727096415.78978: in VariableManager get_vars() 24971 1727096415.79007: Calling all_inventory to load vars for managed_node3 24971 1727096415.79009: Calling groups_inventory to load vars for managed_node3 24971 1727096415.79013: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096415.79023: Calling all_plugins_play to load vars for managed_node3 24971 1727096415.79025: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096415.79028: Calling groups_plugins_play to load vars for managed_node3 24971 1727096415.79215: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096415.79399: done with get_vars() 24971 1727096415.79411: done getting variables 24971 1727096415.79533: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Monday 23 September 2024 09:00:15 -0400 (0:00:00.856) 0:00:03.273 ****** 24971 1727096415.79561: entering _queue_task() for managed_node3/set_fact 24971 1727096415.79563: Creating lock for set_fact 24971 1727096415.80192: worker is 1 (out of 1 available) 24971 1727096415.80204: exiting _queue_task() for managed_node3/set_fact 24971 1727096415.80214: done queuing things up, now waiting for results queue to drain 24971 1727096415.80215: waiting for pending results... 
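Before the execution trace that follows, it may help to see what the queued set_fact task plausibly looks like. The sketch below is reconstructed from the evidence in this log (the conditional "not __network_is_ostree is defined", the __ostree_booted_stat variable, and the resulting fact __network_is_ostree: false); it is not the literal content of el_repo_setup.yml:22:

  - name: Set flag to indicate system is ostree
    ansible.builtin.set_fact:
      # /run/ostree-booted does not exist on this host, so this evaluates to false
      __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
    when: not __network_is_ostree is defined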
24971 1727096415.80461: running TaskExecutor() for managed_node3/TASK: Set flag to indicate system is ostree 24971 1727096415.80603: in run() - task 0afff68d-5257-3482-6844-0000000000cd 24971 1727096415.80620: variable 'ansible_search_path' from source: unknown 24971 1727096415.80627: variable 'ansible_search_path' from source: unknown 24971 1727096415.80674: calling self._execute() 24971 1727096415.80760: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096415.80782: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096415.80799: variable 'omit' from source: magic vars 24971 1727096415.81200: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24971 1727096415.81386: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24971 1727096415.81417: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24971 1727096415.81448: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24971 1727096415.81498: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24971 1727096415.81560: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 24971 1727096415.81585: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 24971 1727096415.81603: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 24971 1727096415.81620: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 24971 1727096415.81716: Evaluated conditional (not __network_is_ostree is defined): True 24971 1727096415.81719: variable 'omit' from source: magic vars 24971 1727096415.81741: variable 'omit' from source: magic vars 24971 1727096415.81826: variable '__ostree_booted_stat' from source: set_fact 24971 1727096415.81864: variable 'omit' from source: magic vars 24971 1727096415.81887: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24971 1727096415.81911: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24971 1727096415.81922: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24971 1727096415.81935: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096415.81944: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096415.81966: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24971 1727096415.81978: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096415.81981: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096415.82046: Set connection var ansible_shell_type to sh 24971 
1727096415.82053: Set connection var ansible_shell_executable to /bin/sh 24971 1727096415.82061: Set connection var ansible_timeout to 10 24971 1727096415.82066: Set connection var ansible_connection to ssh 24971 1727096415.82077: Set connection var ansible_pipelining to False 24971 1727096415.82079: Set connection var ansible_module_compression to ZIP_DEFLATED 24971 1727096415.82095: variable 'ansible_shell_executable' from source: unknown 24971 1727096415.82098: variable 'ansible_connection' from source: unknown 24971 1727096415.82100: variable 'ansible_module_compression' from source: unknown 24971 1727096415.82103: variable 'ansible_shell_type' from source: unknown 24971 1727096415.82105: variable 'ansible_shell_executable' from source: unknown 24971 1727096415.82107: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096415.82109: variable 'ansible_pipelining' from source: unknown 24971 1727096415.82112: variable 'ansible_timeout' from source: unknown 24971 1727096415.82116: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096415.82187: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24971 1727096415.82192: variable 'omit' from source: magic vars 24971 1727096415.82198: starting attempt loop 24971 1727096415.82201: running the handler 24971 1727096415.82209: handler run complete 24971 1727096415.82217: attempt loop complete, returning result 24971 1727096415.82219: _execute() done 24971 1727096415.82222: dumping result to json 24971 1727096415.82224: done dumping result, returning 24971 1727096415.82235: done running TaskExecutor() for managed_node3/TASK: Set flag to indicate system is ostree [0afff68d-5257-3482-6844-0000000000cd] 24971 1727096415.82237: sending task result for task 0afff68d-5257-3482-6844-0000000000cd 24971 1727096415.82312: done sending task result for task 0afff68d-5257-3482-6844-0000000000cd 24971 1727096415.82315: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 24971 1727096415.82390: no more pending results, returning what we have 24971 1727096415.82393: results queue empty 24971 1727096415.82393: checking for any_errors_fatal 24971 1727096415.82401: done checking for any_errors_fatal 24971 1727096415.82405: checking for max_fail_percentage 24971 1727096415.82406: done checking for max_fail_percentage 24971 1727096415.82407: checking to see if all hosts have failed and the running result is not ok 24971 1727096415.82408: done checking to see if all hosts have failed 24971 1727096415.82409: getting the remaining hosts for this loop 24971 1727096415.82410: done getting the remaining hosts for this loop 24971 1727096415.82414: getting the next task for host managed_node3 24971 1727096415.82421: done getting next task for host managed_node3 24971 1727096415.82425: ^ task is: TASK: Fix CentOS6 Base repo 24971 1727096415.82427: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24971 1727096415.82431: getting variables 24971 1727096415.82432: in VariableManager get_vars() 24971 1727096415.82455: Calling all_inventory to load vars for managed_node3 24971 1727096415.82457: Calling groups_inventory to load vars for managed_node3 24971 1727096415.82460: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096415.82472: Calling all_plugins_play to load vars for managed_node3 24971 1727096415.82475: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096415.82483: Calling groups_plugins_play to load vars for managed_node3 24971 1727096415.82773: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096415.83392: done with get_vars() 24971 1727096415.83440: done getting variables 24971 1727096415.83695: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Monday 23 September 2024 09:00:15 -0400 (0:00:00.041) 0:00:03.315 ****** 24971 1727096415.83726: entering _queue_task() for managed_node3/copy 24971 1727096415.84479: worker is 1 (out of 1 available) 24971 1727096415.84490: exiting _queue_task() for managed_node3/copy 24971 1727096415.84507: done queuing things up, now waiting for results queue to drain 24971 1727096415.84508: waiting for pending results... 
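The "Fix CentOS6 Base repo" task queued here is a copy action gated, per the conditional evaluations below, on ansible_distribution == 'CentOS' and ansible_distribution_major_version == '6'; the second condition is False on this host, so the task is skipped. A hedged sketch of such a task follows; the destination path and file content are illustrative placeholders only and are not shown anywhere in this log:

  - name: Fix CentOS6 Base repo
    ansible.builtin.copy:
      dest: /etc/yum.repos.d/CentOS-Base.repo   # placeholder path, not taken from this log
      content: |
        # placeholder: repo definitions for an end-of-life CentOS 6 mirror would go here
    when:
      - ansible_distribution == 'CentOS'
      - ansible_distribution_major_version == '6'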
24971 1727096415.84627: running TaskExecutor() for managed_node3/TASK: Fix CentOS6 Base repo 24971 1727096415.85079: in run() - task 0afff68d-5257-3482-6844-0000000000cf 24971 1727096415.85083: variable 'ansible_search_path' from source: unknown 24971 1727096415.85086: variable 'ansible_search_path' from source: unknown 24971 1727096415.85093: calling self._execute() 24971 1727096415.85157: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096415.85161: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096415.85377: variable 'omit' from source: magic vars 24971 1727096415.86276: variable 'ansible_distribution' from source: facts 24971 1727096415.86296: Evaluated conditional (ansible_distribution == 'CentOS'): True 24971 1727096415.86511: variable 'ansible_distribution_major_version' from source: facts 24971 1727096415.86517: Evaluated conditional (ansible_distribution_major_version == '6'): False 24971 1727096415.86520: when evaluation is False, skipping this task 24971 1727096415.86523: _execute() done 24971 1727096415.86525: dumping result to json 24971 1727096415.86528: done dumping result, returning 24971 1727096415.86535: done running TaskExecutor() for managed_node3/TASK: Fix CentOS6 Base repo [0afff68d-5257-3482-6844-0000000000cf] 24971 1727096415.86539: sending task result for task 0afff68d-5257-3482-6844-0000000000cf 24971 1727096415.86633: done sending task result for task 0afff68d-5257-3482-6844-0000000000cf 24971 1727096415.86636: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 24971 1727096415.86703: no more pending results, returning what we have 24971 1727096415.86706: results queue empty 24971 1727096415.86707: checking for any_errors_fatal 24971 1727096415.86712: done checking for any_errors_fatal 24971 1727096415.86712: checking for max_fail_percentage 24971 1727096415.86714: done checking for max_fail_percentage 24971 1727096415.86715: checking to see if all hosts have failed and the running result is not ok 24971 1727096415.86715: done checking to see if all hosts have failed 24971 1727096415.86716: getting the remaining hosts for this loop 24971 1727096415.86717: done getting the remaining hosts for this loop 24971 1727096415.86722: getting the next task for host managed_node3 24971 1727096415.86728: done getting next task for host managed_node3 24971 1727096415.86730: ^ task is: TASK: Include the task 'enable_epel.yml' 24971 1727096415.86733: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24971 1727096415.86737: getting variables 24971 1727096415.86738: in VariableManager get_vars() 24971 1727096415.86994: Calling all_inventory to load vars for managed_node3 24971 1727096415.86997: Calling groups_inventory to load vars for managed_node3 24971 1727096415.87001: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096415.87012: Calling all_plugins_play to load vars for managed_node3 24971 1727096415.87014: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096415.87021: Calling groups_plugins_play to load vars for managed_node3 24971 1727096415.87353: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096415.87750: done with get_vars() 24971 1727096415.87759: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Monday 23 September 2024 09:00:15 -0400 (0:00:00.042) 0:00:03.357 ****** 24971 1727096415.87994: entering _queue_task() for managed_node3/include_tasks 24971 1727096415.88672: worker is 1 (out of 1 available) 24971 1727096415.88684: exiting _queue_task() for managed_node3/include_tasks 24971 1727096415.88696: done queuing things up, now waiting for results queue to drain 24971 1727096415.88697: waiting for pending results... 24971 1727096415.89227: running TaskExecutor() for managed_node3/TASK: Include the task 'enable_epel.yml' 24971 1727096415.89232: in run() - task 0afff68d-5257-3482-6844-0000000000d0 24971 1727096415.89236: variable 'ansible_search_path' from source: unknown 24971 1727096415.89238: variable 'ansible_search_path' from source: unknown 24971 1727096415.89241: calling self._execute() 24971 1727096415.89674: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096415.89678: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096415.89680: variable 'omit' from source: magic vars 24971 1727096415.90474: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24971 1727096415.94296: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24971 1727096415.94578: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24971 1727096415.94594: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24971 1727096415.94701: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24971 1727096415.94733: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24971 1727096415.94912: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24971 1727096415.94947: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24971 1727096415.94984: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, 
class_only=False) 24971 1727096415.95039: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24971 1727096415.95060: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24971 1727096415.95186: variable '__network_is_ostree' from source: set_fact 24971 1727096415.95210: Evaluated conditional (not __network_is_ostree | d(false)): True 24971 1727096415.95223: _execute() done 24971 1727096415.95231: dumping result to json 24971 1727096415.95239: done dumping result, returning 24971 1727096415.95251: done running TaskExecutor() for managed_node3/TASK: Include the task 'enable_epel.yml' [0afff68d-5257-3482-6844-0000000000d0] 24971 1727096415.95261: sending task result for task 0afff68d-5257-3482-6844-0000000000d0 24971 1727096415.95422: no more pending results, returning what we have 24971 1727096415.95427: in VariableManager get_vars() 24971 1727096415.95463: Calling all_inventory to load vars for managed_node3 24971 1727096415.95469: Calling groups_inventory to load vars for managed_node3 24971 1727096415.95473: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096415.95485: Calling all_plugins_play to load vars for managed_node3 24971 1727096415.95488: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096415.95492: Calling groups_plugins_play to load vars for managed_node3 24971 1727096415.95925: done sending task result for task 0afff68d-5257-3482-6844-0000000000d0 24971 1727096415.95929: WORKER PROCESS EXITING 24971 1727096415.95952: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096415.96142: done with get_vars() 24971 1727096415.96151: variable 'ansible_search_path' from source: unknown 24971 1727096415.96152: variable 'ansible_search_path' from source: unknown 24971 1727096415.96192: we have included files to process 24971 1727096415.96194: generating all_blocks data 24971 1727096415.96195: done generating all_blocks data 24971 1727096415.96200: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 24971 1727096415.96202: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 24971 1727096415.96204: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 24971 1727096415.96931: done processing included file 24971 1727096415.96934: iterating over new_blocks loaded from include file 24971 1727096415.96935: in VariableManager get_vars() 24971 1727096415.96946: done with get_vars() 24971 1727096415.96948: filtering new block on tags 24971 1727096415.96971: done filtering new block on tags 24971 1727096415.96974: in VariableManager get_vars() 24971 1727096415.96984: done with get_vars() 24971 1727096415.96986: filtering new block on tags 24971 1727096415.96996: done filtering new block on tags 24971 1727096415.96998: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed_node3 24971 1727096415.97003: extending task lists for all hosts with included blocks 
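At this point enable_epel.yml has been loaded, its blocks filtered on tags, and the host's task list extended with the included tasks. A minimal sketch of the include that drives this expansion, assuming a path relative to the el_repo_setup.yml shown in the task path above; the log confirms the task name, the included file, and the ostree guard evaluated just before the include.

# Minimal sketch; the relative path is an assumption.
- name: Include the task 'enable_epel.yml'
  include_tasks: enable_epel.yml
  when: not __network_is_ostree | d(false)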
24971 1727096415.97105: done extending task lists 24971 1727096415.97107: done processing included files 24971 1727096415.97107: results queue empty 24971 1727096415.97108: checking for any_errors_fatal 24971 1727096415.97111: done checking for any_errors_fatal 24971 1727096415.97111: checking for max_fail_percentage 24971 1727096415.97112: done checking for max_fail_percentage 24971 1727096415.97113: checking to see if all hosts have failed and the running result is not ok 24971 1727096415.97114: done checking to see if all hosts have failed 24971 1727096415.97114: getting the remaining hosts for this loop 24971 1727096415.97115: done getting the remaining hosts for this loop 24971 1727096415.97118: getting the next task for host managed_node3 24971 1727096415.97121: done getting next task for host managed_node3 24971 1727096415.97123: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 24971 1727096415.97126: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24971 1727096415.97128: getting variables 24971 1727096415.97129: in VariableManager get_vars() 24971 1727096415.97137: Calling all_inventory to load vars for managed_node3 24971 1727096415.97139: Calling groups_inventory to load vars for managed_node3 24971 1727096415.97141: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096415.97146: Calling all_plugins_play to load vars for managed_node3 24971 1727096415.97154: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096415.97157: Calling groups_plugins_play to load vars for managed_node3 24971 1727096415.97511: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096415.97899: done with get_vars() 24971 1727096415.97908: done getting variables 24971 1727096415.97971: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) 24971 1727096415.98370: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 10] ********************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Monday 23 September 2024 09:00:15 -0400 (0:00:00.104) 0:00:03.461 ****** 24971 1727096415.98413: entering _queue_task() for managed_node3/command 24971 1727096415.98415: Creating lock for command 24971 1727096415.98753: worker is 1 (out of 1 available) 24971 1727096415.98765: exiting _queue_task() for managed_node3/command 24971 1727096415.98780: done queuing things up, now waiting for results queue to drain 24971 1727096415.98781: waiting for pending results... 
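The templated task name 'Create EPEL {{ ansible_distribution_major_version }}' renders to 'Create EPEL 10' because the gathered facts report major version 10 on this host, which is also why the version guard in the next lines fails and the task is skipped. Only the name, the command action plugin, and the two conditionals are confirmed by the log; the command line in this sketch is a placeholder.

# Hypothetical sketch of the skipped task; the real command is not visible in this log.
- name: Create EPEL {{ ansible_distribution_major_version }}
  command: echo "would set up the EPEL {{ ansible_distribution_major_version }} repository here"    # placeholder
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version in ['7', '8']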
24971 1727096415.99022: running TaskExecutor() for managed_node3/TASK: Create EPEL 10 24971 1727096415.99175: in run() - task 0afff68d-5257-3482-6844-0000000000ea 24971 1727096415.99179: variable 'ansible_search_path' from source: unknown 24971 1727096415.99182: variable 'ansible_search_path' from source: unknown 24971 1727096415.99197: calling self._execute() 24971 1727096415.99274: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096415.99290: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096415.99305: variable 'omit' from source: magic vars 24971 1727096415.99669: variable 'ansible_distribution' from source: facts 24971 1727096415.99685: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 24971 1727096415.99816: variable 'ansible_distribution_major_version' from source: facts 24971 1727096415.99934: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 24971 1727096415.99937: when evaluation is False, skipping this task 24971 1727096415.99940: _execute() done 24971 1727096415.99942: dumping result to json 24971 1727096415.99944: done dumping result, returning 24971 1727096415.99946: done running TaskExecutor() for managed_node3/TASK: Create EPEL 10 [0afff68d-5257-3482-6844-0000000000ea] 24971 1727096415.99948: sending task result for task 0afff68d-5257-3482-6844-0000000000ea 24971 1727096416.00024: done sending task result for task 0afff68d-5257-3482-6844-0000000000ea 24971 1727096416.00027: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 24971 1727096416.00083: no more pending results, returning what we have 24971 1727096416.00087: results queue empty 24971 1727096416.00088: checking for any_errors_fatal 24971 1727096416.00089: done checking for any_errors_fatal 24971 1727096416.00090: checking for max_fail_percentage 24971 1727096416.00092: done checking for max_fail_percentage 24971 1727096416.00092: checking to see if all hosts have failed and the running result is not ok 24971 1727096416.00093: done checking to see if all hosts have failed 24971 1727096416.00094: getting the remaining hosts for this loop 24971 1727096416.00095: done getting the remaining hosts for this loop 24971 1727096416.00099: getting the next task for host managed_node3 24971 1727096416.00106: done getting next task for host managed_node3 24971 1727096416.00109: ^ task is: TASK: Install yum-utils package 24971 1727096416.00113: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24971 1727096416.00117: getting variables 24971 1727096416.00118: in VariableManager get_vars() 24971 1727096416.00148: Calling all_inventory to load vars for managed_node3 24971 1727096416.00150: Calling groups_inventory to load vars for managed_node3 24971 1727096416.00154: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096416.00170: Calling all_plugins_play to load vars for managed_node3 24971 1727096416.00173: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096416.00176: Calling groups_plugins_play to load vars for managed_node3 24971 1727096416.00694: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096416.00904: done with get_vars() 24971 1727096416.00914: done getting variables 24971 1727096416.01010: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Monday 23 September 2024 09:00:16 -0400 (0:00:00.026) 0:00:03.488 ****** 24971 1727096416.01036: entering _queue_task() for managed_node3/package 24971 1727096416.01038: Creating lock for package 24971 1727096416.01483: worker is 1 (out of 1 available) 24971 1727096416.01491: exiting _queue_task() for managed_node3/package 24971 1727096416.01501: done queuing things up, now waiting for results queue to drain 24971 1727096416.01502: waiting for pending results... 
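The 'Install yum-utils package' task uses the generic package action plugin and the same distribution and version guards as the EPEL tasks, so it is skipped on this host as well. In this sketch the package name is inferred from the task name; everything else mirrors what the log evaluates.

# Sketch assuming the package name matches the task name.
- name: Install yum-utils package
  package:
    name: yum-utils
    state: present
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version in ['7', '8']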
24971 1727096416.01626: running TaskExecutor() for managed_node3/TASK: Install yum-utils package 24971 1727096416.01677: in run() - task 0afff68d-5257-3482-6844-0000000000eb 24971 1727096416.01680: variable 'ansible_search_path' from source: unknown 24971 1727096416.01682: variable 'ansible_search_path' from source: unknown 24971 1727096416.01834: calling self._execute() 24971 1727096416.01837: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096416.01839: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096416.01842: variable 'omit' from source: magic vars 24971 1727096416.02171: variable 'ansible_distribution' from source: facts 24971 1727096416.02190: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 24971 1727096416.02323: variable 'ansible_distribution_major_version' from source: facts 24971 1727096416.02334: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 24971 1727096416.02342: when evaluation is False, skipping this task 24971 1727096416.02349: _execute() done 24971 1727096416.02355: dumping result to json 24971 1727096416.02362: done dumping result, returning 24971 1727096416.02379: done running TaskExecutor() for managed_node3/TASK: Install yum-utils package [0afff68d-5257-3482-6844-0000000000eb] 24971 1727096416.02389: sending task result for task 0afff68d-5257-3482-6844-0000000000eb 24971 1727096416.02675: done sending task result for task 0afff68d-5257-3482-6844-0000000000eb 24971 1727096416.02678: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 24971 1727096416.02712: no more pending results, returning what we have 24971 1727096416.02714: results queue empty 24971 1727096416.02715: checking for any_errors_fatal 24971 1727096416.02719: done checking for any_errors_fatal 24971 1727096416.02720: checking for max_fail_percentage 24971 1727096416.02721: done checking for max_fail_percentage 24971 1727096416.02722: checking to see if all hosts have failed and the running result is not ok 24971 1727096416.02723: done checking to see if all hosts have failed 24971 1727096416.02724: getting the remaining hosts for this loop 24971 1727096416.02725: done getting the remaining hosts for this loop 24971 1727096416.02728: getting the next task for host managed_node3 24971 1727096416.02733: done getting next task for host managed_node3 24971 1727096416.02735: ^ task is: TASK: Enable EPEL 7 24971 1727096416.02738: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24971 1727096416.02742: getting variables 24971 1727096416.02743: in VariableManager get_vars() 24971 1727096416.02770: Calling all_inventory to load vars for managed_node3 24971 1727096416.02773: Calling groups_inventory to load vars for managed_node3 24971 1727096416.02776: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096416.02787: Calling all_plugins_play to load vars for managed_node3 24971 1727096416.02789: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096416.02792: Calling groups_plugins_play to load vars for managed_node3 24971 1727096416.02985: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096416.03184: done with get_vars() 24971 1727096416.03194: done getting variables 24971 1727096416.03251: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Monday 23 September 2024 09:00:16 -0400 (0:00:00.022) 0:00:03.510 ****** 24971 1727096416.03283: entering _queue_task() for managed_node3/command 24971 1727096416.03511: worker is 1 (out of 1 available) 24971 1727096416.03523: exiting _queue_task() for managed_node3/command 24971 1727096416.03534: done queuing things up, now waiting for results queue to drain 24971 1727096416.03534: waiting for pending results... 24971 1727096416.03772: running TaskExecutor() for managed_node3/TASK: Enable EPEL 7 24971 1727096416.03883: in run() - task 0afff68d-5257-3482-6844-0000000000ec 24971 1727096416.03903: variable 'ansible_search_path' from source: unknown 24971 1727096416.03910: variable 'ansible_search_path' from source: unknown 24971 1727096416.03945: calling self._execute() 24971 1727096416.04022: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096416.04033: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096416.04047: variable 'omit' from source: magic vars 24971 1727096416.04398: variable 'ansible_distribution' from source: facts 24971 1727096416.04415: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 24971 1727096416.04550: variable 'ansible_distribution_major_version' from source: facts 24971 1727096416.04561: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 24971 1727096416.04570: when evaluation is False, skipping this task 24971 1727096416.04579: _execute() done 24971 1727096416.04593: dumping result to json 24971 1727096416.04652: done dumping result, returning 24971 1727096416.04655: done running TaskExecutor() for managed_node3/TASK: Enable EPEL 7 [0afff68d-5257-3482-6844-0000000000ec] 24971 1727096416.04658: sending task result for task 0afff68d-5257-3482-6844-0000000000ec 24971 1727096416.04719: done sending task result for task 0afff68d-5257-3482-6844-0000000000ec 24971 1727096416.04723: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 24971 1727096416.04806: no more pending results, returning what we 
have 24971 1727096416.04809: results queue empty 24971 1727096416.04810: checking for any_errors_fatal 24971 1727096416.04816: done checking for any_errors_fatal 24971 1727096416.04816: checking for max_fail_percentage 24971 1727096416.04818: done checking for max_fail_percentage 24971 1727096416.04819: checking to see if all hosts have failed and the running result is not ok 24971 1727096416.04819: done checking to see if all hosts have failed 24971 1727096416.04820: getting the remaining hosts for this loop 24971 1727096416.04822: done getting the remaining hosts for this loop 24971 1727096416.04826: getting the next task for host managed_node3 24971 1727096416.04832: done getting next task for host managed_node3 24971 1727096416.04834: ^ task is: TASK: Enable EPEL 8 24971 1727096416.04839: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24971 1727096416.04843: getting variables 24971 1727096416.04844: in VariableManager get_vars() 24971 1727096416.04876: Calling all_inventory to load vars for managed_node3 24971 1727096416.04879: Calling groups_inventory to load vars for managed_node3 24971 1727096416.04883: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096416.04896: Calling all_plugins_play to load vars for managed_node3 24971 1727096416.04899: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096416.04901: Calling groups_plugins_play to load vars for managed_node3 24971 1727096416.05257: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096416.05445: done with get_vars() 24971 1727096416.05454: done getting variables 24971 1727096416.05509: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Monday 23 September 2024 09:00:16 -0400 (0:00:00.022) 0:00:03.533 ****** 24971 1727096416.05536: entering _queue_task() for managed_node3/command 24971 1727096416.05758: worker is 1 (out of 1 available) 24971 1727096416.05972: exiting _queue_task() for managed_node3/command 24971 1727096416.05983: done queuing things up, now waiting for results queue to drain 24971 1727096416.05984: waiting for pending results... 
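'Enable EPEL 7' and 'Enable EPEL 8' are both command tasks from enable_epel.yml (lines 32 and 37) behind the same guards, and 'Enable EPEL 6' further down is a copy task guarded by a major version of 6; all three skip on this host. A sketch of the EPEL 8 variant, assuming a yum-config-manager invocation, which the log does not show.

# Hypothetical sketch; the actual command in enable_epel.yml is not visible here.
- name: Enable EPEL 8
  command: yum-config-manager --enable epel    # placeholder command
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version in ['7', '8']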
24971 1727096416.06086: running TaskExecutor() for managed_node3/TASK: Enable EPEL 8 24971 1727096416.06126: in run() - task 0afff68d-5257-3482-6844-0000000000ed 24971 1727096416.06146: variable 'ansible_search_path' from source: unknown 24971 1727096416.06155: variable 'ansible_search_path' from source: unknown 24971 1727096416.06196: calling self._execute() 24971 1727096416.06317: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096416.06321: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096416.06325: variable 'omit' from source: magic vars 24971 1727096416.06677: variable 'ansible_distribution' from source: facts 24971 1727096416.06695: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 24971 1727096416.06827: variable 'ansible_distribution_major_version' from source: facts 24971 1727096416.06837: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 24971 1727096416.06860: when evaluation is False, skipping this task 24971 1727096416.06862: _execute() done 24971 1727096416.06865: dumping result to json 24971 1727096416.06867: done dumping result, returning 24971 1727096416.06871: done running TaskExecutor() for managed_node3/TASK: Enable EPEL 8 [0afff68d-5257-3482-6844-0000000000ed] 24971 1727096416.06973: sending task result for task 0afff68d-5257-3482-6844-0000000000ed 24971 1727096416.07035: done sending task result for task 0afff68d-5257-3482-6844-0000000000ed 24971 1727096416.07038: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 24971 1727096416.07112: no more pending results, returning what we have 24971 1727096416.07115: results queue empty 24971 1727096416.07116: checking for any_errors_fatal 24971 1727096416.07121: done checking for any_errors_fatal 24971 1727096416.07122: checking for max_fail_percentage 24971 1727096416.07124: done checking for max_fail_percentage 24971 1727096416.07124: checking to see if all hosts have failed and the running result is not ok 24971 1727096416.07125: done checking to see if all hosts have failed 24971 1727096416.07126: getting the remaining hosts for this loop 24971 1727096416.07127: done getting the remaining hosts for this loop 24971 1727096416.07130: getting the next task for host managed_node3 24971 1727096416.07139: done getting next task for host managed_node3 24971 1727096416.07142: ^ task is: TASK: Enable EPEL 6 24971 1727096416.07146: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24971 1727096416.07150: getting variables 24971 1727096416.07151: in VariableManager get_vars() 24971 1727096416.07182: Calling all_inventory to load vars for managed_node3 24971 1727096416.07185: Calling groups_inventory to load vars for managed_node3 24971 1727096416.07189: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096416.07201: Calling all_plugins_play to load vars for managed_node3 24971 1727096416.07204: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096416.07207: Calling groups_plugins_play to load vars for managed_node3 24971 1727096416.07498: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096416.07688: done with get_vars() 24971 1727096416.07697: done getting variables 24971 1727096416.07749: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Monday 23 September 2024 09:00:16 -0400 (0:00:00.022) 0:00:03.555 ****** 24971 1727096416.07778: entering _queue_task() for managed_node3/copy 24971 1727096416.08096: worker is 1 (out of 1 available) 24971 1727096416.08106: exiting _queue_task() for managed_node3/copy 24971 1727096416.08117: done queuing things up, now waiting for results queue to drain 24971 1727096416.08118: waiting for pending results... 24971 1727096416.08255: running TaskExecutor() for managed_node3/TASK: Enable EPEL 6 24971 1727096416.08449: in run() - task 0afff68d-5257-3482-6844-0000000000ef 24971 1727096416.08453: variable 'ansible_search_path' from source: unknown 24971 1727096416.08456: variable 'ansible_search_path' from source: unknown 24971 1727096416.08458: calling self._execute() 24971 1727096416.08490: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096416.08501: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096416.08515: variable 'omit' from source: magic vars 24971 1727096416.08856: variable 'ansible_distribution' from source: facts 24971 1727096416.08875: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 24971 1727096416.08990: variable 'ansible_distribution_major_version' from source: facts 24971 1727096416.09005: Evaluated conditional (ansible_distribution_major_version == '6'): False 24971 1727096416.09017: when evaluation is False, skipping this task 24971 1727096416.09025: _execute() done 24971 1727096416.09030: dumping result to json 24971 1727096416.09036: done dumping result, returning 24971 1727096416.09073: done running TaskExecutor() for managed_node3/TASK: Enable EPEL 6 [0afff68d-5257-3482-6844-0000000000ef] 24971 1727096416.09077: sending task result for task 0afff68d-5257-3482-6844-0000000000ef 24971 1727096416.09302: done sending task result for task 0afff68d-5257-3482-6844-0000000000ef 24971 1727096416.09305: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 24971 1727096416.09339: no more pending results, returning what we have 24971 
1727096416.09342: results queue empty 24971 1727096416.09343: checking for any_errors_fatal 24971 1727096416.09347: done checking for any_errors_fatal 24971 1727096416.09348: checking for max_fail_percentage 24971 1727096416.09349: done checking for max_fail_percentage 24971 1727096416.09350: checking to see if all hosts have failed and the running result is not ok 24971 1727096416.09351: done checking to see if all hosts have failed 24971 1727096416.09352: getting the remaining hosts for this loop 24971 1727096416.09353: done getting the remaining hosts for this loop 24971 1727096416.09356: getting the next task for host managed_node3 24971 1727096416.09363: done getting next task for host managed_node3 24971 1727096416.09365: ^ task is: TASK: Set network provider to 'nm' 24971 1727096416.09369: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24971 1727096416.09373: getting variables 24971 1727096416.09374: in VariableManager get_vars() 24971 1727096416.09399: Calling all_inventory to load vars for managed_node3 24971 1727096416.09402: Calling groups_inventory to load vars for managed_node3 24971 1727096416.09405: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096416.09414: Calling all_plugins_play to load vars for managed_node3 24971 1727096416.09416: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096416.09419: Calling groups_plugins_play to load vars for managed_node3 24971 1727096416.09654: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096416.10041: done with get_vars() 24971 1727096416.10049: done getting variables 24971 1727096416.10208: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tests_ipv6_nm.yml:13 Monday 23 September 2024 09:00:16 -0400 (0:00:00.024) 0:00:03.580 ****** 24971 1727096416.10233: entering _queue_task() for managed_node3/set_fact 24971 1727096416.10749: worker is 1 (out of 1 available) 24971 1727096416.10761: exiting _queue_task() for managed_node3/set_fact 24971 1727096416.10775: done queuing things up, now waiting for results queue to drain 24971 1727096416.10776: waiting for pending results... 
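The "Set network provider to 'nm'" entry comes from tests_ipv6_nm.yml:13, and the ok result a little further on shows the fact it sets. This task can be reconstructed almost entirely from the log:

# Reconstructed from the task name and the ansible_facts result below.
- name: Set network provider to 'nm'
  set_fact:
    network_provider: nm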
24971 1727096416.11066: running TaskExecutor() for managed_node3/TASK: Set network provider to 'nm' 24971 1727096416.11172: in run() - task 0afff68d-5257-3482-6844-000000000007 24971 1727096416.11190: variable 'ansible_search_path' from source: unknown 24971 1727096416.11227: calling self._execute() 24971 1727096416.11303: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096416.11314: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096416.11327: variable 'omit' from source: magic vars 24971 1727096416.11435: variable 'omit' from source: magic vars 24971 1727096416.11474: variable 'omit' from source: magic vars 24971 1727096416.11518: variable 'omit' from source: magic vars 24971 1727096416.11562: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24971 1727096416.11606: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24971 1727096416.11629: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24971 1727096416.11650: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096416.11664: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096416.11707: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24971 1727096416.11817: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096416.11820: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096416.11834: Set connection var ansible_shell_type to sh 24971 1727096416.11850: Set connection var ansible_shell_executable to /bin/sh 24971 1727096416.11866: Set connection var ansible_timeout to 10 24971 1727096416.11878: Set connection var ansible_connection to ssh 24971 1727096416.11887: Set connection var ansible_pipelining to False 24971 1727096416.11895: Set connection var ansible_module_compression to ZIP_DEFLATED 24971 1727096416.11918: variable 'ansible_shell_executable' from source: unknown 24971 1727096416.11929: variable 'ansible_connection' from source: unknown 24971 1727096416.11936: variable 'ansible_module_compression' from source: unknown 24971 1727096416.11942: variable 'ansible_shell_type' from source: unknown 24971 1727096416.11948: variable 'ansible_shell_executable' from source: unknown 24971 1727096416.11954: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096416.11960: variable 'ansible_pipelining' from source: unknown 24971 1727096416.11966: variable 'ansible_timeout' from source: unknown 24971 1727096416.11976: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096416.12116: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24971 1727096416.12142: variable 'omit' from source: magic vars 24971 1727096416.12145: starting attempt loop 24971 1727096416.12148: running the handler 24971 1727096416.12175: handler run complete 24971 1727096416.12254: attempt loop complete, returning result 24971 1727096416.12257: _execute() done 24971 1727096416.12259: 
dumping result to json 24971 1727096416.12261: done dumping result, returning 24971 1727096416.12263: done running TaskExecutor() for managed_node3/TASK: Set network provider to 'nm' [0afff68d-5257-3482-6844-000000000007] 24971 1727096416.12265: sending task result for task 0afff68d-5257-3482-6844-000000000007 24971 1727096416.12329: done sending task result for task 0afff68d-5257-3482-6844-000000000007 24971 1727096416.12332: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 24971 1727096416.12414: no more pending results, returning what we have 24971 1727096416.12416: results queue empty 24971 1727096416.12417: checking for any_errors_fatal 24971 1727096416.12423: done checking for any_errors_fatal 24971 1727096416.12424: checking for max_fail_percentage 24971 1727096416.12426: done checking for max_fail_percentage 24971 1727096416.12427: checking to see if all hosts have failed and the running result is not ok 24971 1727096416.12428: done checking to see if all hosts have failed 24971 1727096416.12428: getting the remaining hosts for this loop 24971 1727096416.12430: done getting the remaining hosts for this loop 24971 1727096416.12434: getting the next task for host managed_node3 24971 1727096416.12441: done getting next task for host managed_node3 24971 1727096416.12443: ^ task is: TASK: meta (flush_handlers) 24971 1727096416.12445: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24971 1727096416.12449: getting variables 24971 1727096416.12451: in VariableManager get_vars() 24971 1727096416.12683: Calling all_inventory to load vars for managed_node3 24971 1727096416.12686: Calling groups_inventory to load vars for managed_node3 24971 1727096416.12688: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096416.12697: Calling all_plugins_play to load vars for managed_node3 24971 1727096416.12700: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096416.12702: Calling groups_plugins_play to load vars for managed_node3 24971 1727096416.12861: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096416.13053: done with get_vars() 24971 1727096416.13062: done getting variables 24971 1727096416.13123: in VariableManager get_vars() 24971 1727096416.13135: Calling all_inventory to load vars for managed_node3 24971 1727096416.13137: Calling groups_inventory to load vars for managed_node3 24971 1727096416.13140: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096416.13144: Calling all_plugins_play to load vars for managed_node3 24971 1727096416.13146: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096416.13149: Calling groups_plugins_play to load vars for managed_node3 24971 1727096416.13282: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096416.13670: done with get_vars() 24971 1727096416.13683: done queuing things up, now waiting for results queue to drain 24971 1727096416.13685: results queue empty 24971 1727096416.13686: checking for any_errors_fatal 24971 1727096416.13687: done checking for any_errors_fatal 24971 1727096416.13688: checking for 
max_fail_percentage 24971 1727096416.13689: done checking for max_fail_percentage 24971 1727096416.13690: checking to see if all hosts have failed and the running result is not ok 24971 1727096416.13690: done checking to see if all hosts have failed 24971 1727096416.13691: getting the remaining hosts for this loop 24971 1727096416.13692: done getting the remaining hosts for this loop 24971 1727096416.13694: getting the next task for host managed_node3 24971 1727096416.13699: done getting next task for host managed_node3 24971 1727096416.13700: ^ task is: TASK: meta (flush_handlers) 24971 1727096416.13701: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24971 1727096416.13711: getting variables 24971 1727096416.13713: in VariableManager get_vars() 24971 1727096416.13720: Calling all_inventory to load vars for managed_node3 24971 1727096416.13722: Calling groups_inventory to load vars for managed_node3 24971 1727096416.13724: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096416.13729: Calling all_plugins_play to load vars for managed_node3 24971 1727096416.13731: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096416.13733: Calling groups_plugins_play to load vars for managed_node3 24971 1727096416.13913: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096416.14191: done with get_vars() 24971 1727096416.14198: done getting variables 24971 1727096416.14240: in VariableManager get_vars() 24971 1727096416.14249: Calling all_inventory to load vars for managed_node3 24971 1727096416.14251: Calling groups_inventory to load vars for managed_node3 24971 1727096416.14254: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096416.14257: Calling all_plugins_play to load vars for managed_node3 24971 1727096416.14260: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096416.14263: Calling groups_plugins_play to load vars for managed_node3 24971 1727096416.14497: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096416.14898: done with get_vars() 24971 1727096416.14908: done queuing things up, now waiting for results queue to drain 24971 1727096416.14910: results queue empty 24971 1727096416.14910: checking for any_errors_fatal 24971 1727096416.14912: done checking for any_errors_fatal 24971 1727096416.14912: checking for max_fail_percentage 24971 1727096416.14913: done checking for max_fail_percentage 24971 1727096416.14914: checking to see if all hosts have failed and the running result is not ok 24971 1727096416.14915: done checking to see if all hosts have failed 24971 1727096416.14915: getting the remaining hosts for this loop 24971 1727096416.14916: done getting the remaining hosts for this loop 24971 1727096416.14918: getting the next task for host managed_node3 24971 1727096416.14920: done getting next task for host managed_node3 24971 1727096416.14921: ^ task is: None 24971 1727096416.14923: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 24971 1727096416.14924: done queuing things up, now waiting for results queue to drain 24971 1727096416.14924: results queue empty 24971 1727096416.14925: checking for any_errors_fatal 24971 1727096416.14926: done checking for any_errors_fatal 24971 1727096416.14926: checking for max_fail_percentage 24971 1727096416.14927: done checking for max_fail_percentage 24971 1727096416.14928: checking to see if all hosts have failed and the running result is not ok 24971 1727096416.14929: done checking to see if all hosts have failed 24971 1727096416.14930: getting the next task for host managed_node3 24971 1727096416.14932: done getting next task for host managed_node3 24971 1727096416.14933: ^ task is: None 24971 1727096416.14934: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24971 1727096416.15153: in VariableManager get_vars() 24971 1727096416.15181: done with get_vars() 24971 1727096416.15188: in VariableManager get_vars() 24971 1727096416.15203: done with get_vars() 24971 1727096416.15208: variable 'omit' from source: magic vars 24971 1727096416.15238: in VariableManager get_vars() 24971 1727096416.15253: done with get_vars() 24971 1727096416.15276: variable 'omit' from source: magic vars PLAY [Play for testing IPv6 config] ******************************************** 24971 1727096416.15702: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 24971 1727096416.15727: getting the remaining hosts for this loop 24971 1727096416.15728: done getting the remaining hosts for this loop 24971 1727096416.15730: getting the next task for host managed_node3 24971 1727096416.15733: done getting next task for host managed_node3 24971 1727096416.15735: ^ task is: TASK: Gathering Facts 24971 1727096416.15736: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24971 1727096416.15738: getting variables 24971 1727096416.15739: in VariableManager get_vars() 24971 1727096416.15751: Calling all_inventory to load vars for managed_node3 24971 1727096416.15753: Calling groups_inventory to load vars for managed_node3 24971 1727096416.15755: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096416.15759: Calling all_plugins_play to load vars for managed_node3 24971 1727096416.15775: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096416.15779: Calling groups_plugins_play to load vars for managed_node3 24971 1727096416.15911: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096416.16096: done with get_vars() 24971 1727096416.16104: done getting variables 24971 1727096416.16140: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:3 Monday 23 September 2024 09:00:16 -0400 (0:00:00.059) 0:00:03.639 ****** 24971 1727096416.16161: entering _queue_task() for managed_node3/gather_facts 24971 1727096416.16407: worker is 1 (out of 1 available) 24971 1727096416.16417: exiting _queue_task() for managed_node3/gather_facts 24971 1727096416.16427: done queuing things up, now waiting for results queue to drain 24971 1727096416.16428: waiting for pending results... 
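A new play, 'Play for testing IPv6 config', begins with its fact-gathering task; the lines that follow trace how the setup module is delivered: connection vars are resolved, a remote temporary directory is created under ~/.ansible/tmp over the multiplexed SSH connection, and the AnsiballZ-packaged setup module is transferred via SFTP. A minimal sketch of the play header; the hosts pattern is an assumption, only the play name and the gathering step come from the log.

# Minimal sketch of the play whose gather_facts step is traced below.
- name: Play for testing IPv6 config
  hosts: managed_node3    # assumption; the inventory pattern is not shown in the log
  gather_facts: true      # implicit in the playbook; made explicit here for clarity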
24971 1727096416.16782: running TaskExecutor() for managed_node3/TASK: Gathering Facts 24971 1727096416.16786: in run() - task 0afff68d-5257-3482-6844-000000000115 24971 1727096416.16791: variable 'ansible_search_path' from source: unknown 24971 1727096416.16794: calling self._execute() 24971 1727096416.16848: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096416.16861: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096416.16878: variable 'omit' from source: magic vars 24971 1727096416.17431: variable 'ansible_distribution_major_version' from source: facts 24971 1727096416.17674: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096416.17678: variable 'omit' from source: magic vars 24971 1727096416.17682: variable 'omit' from source: magic vars 24971 1727096416.17685: variable 'omit' from source: magic vars 24971 1727096416.17783: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24971 1727096416.17827: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24971 1727096416.17851: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24971 1727096416.18060: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096416.18081: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096416.18203: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24971 1727096416.18212: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096416.18219: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096416.18474: Set connection var ansible_shell_type to sh 24971 1727096416.18478: Set connection var ansible_shell_executable to /bin/sh 24971 1727096416.18481: Set connection var ansible_timeout to 10 24971 1727096416.18483: Set connection var ansible_connection to ssh 24971 1727096416.18486: Set connection var ansible_pipelining to False 24971 1727096416.18489: Set connection var ansible_module_compression to ZIP_DEFLATED 24971 1727096416.18491: variable 'ansible_shell_executable' from source: unknown 24971 1727096416.18494: variable 'ansible_connection' from source: unknown 24971 1727096416.18496: variable 'ansible_module_compression' from source: unknown 24971 1727096416.18499: variable 'ansible_shell_type' from source: unknown 24971 1727096416.18501: variable 'ansible_shell_executable' from source: unknown 24971 1727096416.18503: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096416.18551: variable 'ansible_pipelining' from source: unknown 24971 1727096416.18559: variable 'ansible_timeout' from source: unknown 24971 1727096416.18566: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096416.18752: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24971 1727096416.18836: variable 'omit' from source: magic vars 24971 1727096416.18846: starting attempt loop 24971 1727096416.18854: running the 
handler 24971 1727096416.18877: variable 'ansible_facts' from source: unknown 24971 1727096416.18901: _low_level_execute_command(): starting 24971 1727096416.18916: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24971 1727096416.19680: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096416.19704: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096416.19757: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 24971 1727096416.21986: stdout chunk (state=3): >>>/root <<< 24971 1727096416.22116: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096416.22143: stderr chunk (state=3): >>><<< 24971 1727096416.22147: stdout chunk (state=3): >>><<< 24971 1727096416.22167: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 24971 1727096416.22187: _low_level_execute_command(): starting 24971 1727096416.22192: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096416.2217228-25152-132889706977501 `" && echo ansible-tmp-1727096416.2217228-25152-132889706977501="` echo /root/.ansible/tmp/ansible-tmp-1727096416.2217228-25152-132889706977501 `" ) && sleep 0' 24971 1727096416.22713: stderr chunk (state=2): 
>>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24971 1727096416.22764: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096416.22798: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096416.22864: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 24971 1727096416.25535: stdout chunk (state=3): >>>ansible-tmp-1727096416.2217228-25152-132889706977501=/root/.ansible/tmp/ansible-tmp-1727096416.2217228-25152-132889706977501 <<< 24971 1727096416.25735: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096416.25739: stdout chunk (state=3): >>><<< 24971 1727096416.25741: stderr chunk (state=3): >>><<< 24971 1727096416.25874: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096416.2217228-25152-132889706977501=/root/.ansible/tmp/ansible-tmp-1727096416.2217228-25152-132889706977501 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 24971 1727096416.25878: variable 'ansible_module_compression' from source: unknown 24971 1727096416.25880: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24971ctsu8jan/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 24971 1727096416.25911: variable 'ansible_facts' from source: unknown 24971 1727096416.26145: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096416.2217228-25152-132889706977501/AnsiballZ_setup.py 24971 1727096416.26337: Sending 
initial data 24971 1727096416.26340: Sent initial data (154 bytes) 24971 1727096416.26891: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096416.26916: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096416.26919: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 24971 1727096416.26922: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096416.26971: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096416.26975: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096416.26986: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096416.27032: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 24971 1727096416.29293: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24971 1727096416.29329: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 24971 1727096416.29363: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24971ctsu8jan/tmpmp4jo3s9 /root/.ansible/tmp/ansible-tmp-1727096416.2217228-25152-132889706977501/AnsiballZ_setup.py <<< 24971 1727096416.29373: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096416.2217228-25152-132889706977501/AnsiballZ_setup.py" <<< 24971 1727096416.29401: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24971ctsu8jan/tmpmp4jo3s9" to remote "/root/.ansible/tmp/ansible-tmp-1727096416.2217228-25152-132889706977501/AnsiballZ_setup.py" <<< 24971 1727096416.29404: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096416.2217228-25152-132889706977501/AnsiballZ_setup.py" <<< 24971 1727096416.30581: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096416.30589: stderr chunk (state=3): >>><<< 24971 1727096416.30601: stdout chunk (state=3): >>><<< 24971 1727096416.30626: done transferring module to remote 24971 1727096416.30707: _low_level_execute_command(): starting 24971 1727096416.30711: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096416.2217228-25152-132889706977501/ /root/.ansible/tmp/ansible-tmp-1727096416.2217228-25152-132889706977501/AnsiballZ_setup.py && sleep 0' 24971 1727096416.31275: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24971 1727096416.31289: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096416.31304: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096416.31333: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096416.31419: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096416.31446: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096416.31462: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096416.31525: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 24971 1727096416.34076: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096416.34084: stdout chunk (state=3): >>><<< 24971 1727096416.34087: stderr chunk (state=3): >>><<< 24971 1727096416.34184: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 24971 1727096416.34188: _low_level_execute_command(): starting 24971 1727096416.34191: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096416.2217228-25152-132889706977501/AnsiballZ_setup.py && sleep 0' 24971 1727096416.34896: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096416.34993: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096416.35234: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 24971 1727096417.17380: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_is_chroot": false, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", 
"ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-14-152.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-14-152", "ansible_nodename": "ip-10-31-14-152.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec24182c1ede649ec25b32fe78ed72bb", "ansible_local": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC7DqKfC0Ye8uNTSswfdzhsJfWNkZqr/RUstyC1z93<<< 24971 1727096417.17429: stdout chunk (state=3): >>>+5saW22gxwNSvixV/q/ri8GxqpAsSE+oK2wsR0V0GdogZH271MQEZ63vu1adXuVvwMLN81oGLTzCz50BgmDXlvpEVTodRA7qlbviA7mmwUA/1WP1v1UqCiHmBjaiQT14v9PJhSSiWk1S/2gPlOmfl801arqI6YnvGYHiIop+XESUB4W8ewtJtTJhFqzucGnDCUYA1VAqcSIdjoQaZhZeDXc1gRTzx7QIe3EGJnCnomIPoXNvQk2UgnH08TJGZFoICgqdfZUi+g2uK+PweRJ1TUsmlf86iGOSFYgnCserDeAdpOIuLR5oBE8UNKjxPBW6hJ3It+vPT8mrYLOXrlbt7it9HhxfgYWcqc6ebJyBYNMo0bPdddEpIPiWDXZOoQmZGaGOpSa1K+8hxIvZ9t+jl8b8b8sODYeoVjVaeVXt0i5hoW0CoLWO77c+cGgwJ5+kzXD7eo/SVn+kYjDfKFBCtkJU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJ80c6wNB873zUjbHI8VtU557DuKd4APS4WjMTqAOLKKumkxtoQY9gCkk5ZG2HLqdKHBgsFER7nThIQ+11R1mBw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINeHLdc2PHGxhVM3VxIFOOPlFPTEnJXEcPAkPavmyu6v", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_fips": false, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2980, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 551, "free": 2980}, "nocache": {"free": 3297, "used": 234}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", 
"ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec24182c-1ede-649e-c25b-32fe78ed72bb", "ansible_product_uuid": "ec24182c-1ede-649e-c25b-32fe78ed72bb", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 559, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261803024384, "block_size": 4096, "block_total": 65519099, "block_available": 63916754, "block_used": 1602345, "inode_total": 131070960, "inode_available": 131029177, "inode_used": 41783, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_loadavg": {"1m": 0.43359375, "5m": 0.50732421875, "15m": 0.31396484375}, "ansible_iscsi_iqn": "", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": "39", "day": "23", "hour": "09", "minute": "00", "second": "17", "epoch": "1727096417", "epoch_int": "1727096417", "date": "2024-09-23", "time": "09:00:17", "iso8601_micro": "2024-09-23T13:00:17.111523Z", "iso8601": "2024-09-23T13:00:17Z", "iso8601_basic": "20240923T090017111523", "iso8601_basic_short": "20240923T090017", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_apparmor": {"status": "disabled"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_lsb": {}, "ansible_fibre_channel_wwn": [], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/1", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.159 44844 10.31.14.152 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.159 44844 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, 
"ansible_pkg_mgr": "dnf", "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offlo<<< 24971 1727096417.17469: stdout chunk (state=3): >>>ad": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:df:7b:8e:75", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.14.152", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:dfff:fe7b:8e75", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", 
"tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.14.152", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:df:7b:8e:75", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.14.152"], "ansible_all_ipv6_addresses": ["fe80::8ff:dfff:fe7b:8e75"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.14.152", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:dfff:fe7b:8e75"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 24971 1727096417.20211: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
<<< 24971 1727096417.20215: stdout chunk (state=3): >>><<< 24971 1727096417.20217: stderr chunk (state=3): >>><<< 24971 1727096417.20260: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_is_chroot": false, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-14-152.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-14-152", "ansible_nodename": "ip-10-31-14-152.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec24182c1ede649ec25b32fe78ed72bb", "ansible_local": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC7DqKfC0Ye8uNTSswfdzhsJfWNkZqr/RUstyC1z93+5saW22gxwNSvixV/q/ri8GxqpAsSE+oK2wsR0V0GdogZH271MQEZ63vu1adXuVvwMLN81oGLTzCz50BgmDXlvpEVTodRA7qlbviA7mmwUA/1WP1v1UqCiHmBjaiQT14v9PJhSSiWk1S/2gPlOmfl801arqI6YnvGYHiIop+XESUB4W8ewtJtTJhFqzucGnDCUYA1VAqcSIdjoQaZhZeDXc1gRTzx7QIe3EGJnCnomIPoXNvQk2UgnH08TJGZFoICgqdfZUi+g2uK+PweRJ1TUsmlf86iGOSFYgnCserDeAdpOIuLR5oBE8UNKjxPBW6hJ3It+vPT8mrYLOXrlbt7it9HhxfgYWcqc6ebJyBYNMo0bPdddEpIPiWDXZOoQmZGaGOpSa1K+8hxIvZ9t+jl8b8b8sODYeoVjVaeVXt0i5hoW0CoLWO77c+cGgwJ5+kzXD7eo/SVn+kYjDfKFBCtkJU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJ80c6wNB873zUjbHI8VtU557DuKd4APS4WjMTqAOLKKumkxtoQY9gCkk5ZG2HLqdKHBgsFER7nThIQ+11R1mBw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINeHLdc2PHGxhVM3VxIFOOPlFPTEnJXEcPAkPavmyu6v", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_fips": false, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": 
"(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2980, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 551, "free": 2980}, "nocache": {"free": 3297, "used": 234}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec24182c-1ede-649e-c25b-32fe78ed72bb", "ansible_product_uuid": "ec24182c-1ede-649e-c25b-32fe78ed72bb", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 559, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261803024384, "block_size": 4096, "block_total": 65519099, "block_available": 63916754, "block_used": 1602345, "inode_total": 131070960, "inode_available": 131029177, "inode_used": 41783, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_loadavg": {"1m": 0.43359375, "5m": 0.50732421875, "15m": 0.31396484375}, "ansible_iscsi_iqn": "", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": "39", "day": "23", "hour": "09", "minute": "00", "second": "17", "epoch": "1727096417", "epoch_int": "1727096417", "date": "2024-09-23", "time": "09:00:17", "iso8601_micro": "2024-09-23T13:00:17.111523Z", "iso8601": "2024-09-23T13:00:17Z", "iso8601_basic": "20240923T090017111523", "iso8601_basic_short": "20240923T090017", "tz": 
"EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_apparmor": {"status": "disabled"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_lsb": {}, "ansible_fibre_channel_wwn": [], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/1", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.159 44844 10.31.14.152 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.159 44844 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_pkg_mgr": "dnf", "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off 
[fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:df:7b:8e:75", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.14.152", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:dfff:fe7b:8e75", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.14.152", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:df:7b:8e:75", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.14.152"], "ansible_all_ipv6_addresses": ["fe80::8ff:dfff:fe7b:8e75"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.14.152", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:dfff:fe7b:8e75"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], 
"module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 24971 1727096417.20728: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096416.2217228-25152-132889706977501/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24971 1727096417.20732: _low_level_execute_command(): starting 24971 1727096417.20734: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096416.2217228-25152-132889706977501/ > /dev/null 2>&1 && sleep 0' 24971 1727096417.21308: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24971 1727096417.21388: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096417.21438: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 
1727096417.21457: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096417.21480: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096417.21549: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 24971 1727096417.24057: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096417.24063: stdout chunk (state=3): >>><<< 24971 1727096417.24071: stderr chunk (state=3): >>><<< 24971 1727096417.24087: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 24971 1727096417.24095: handler run complete 24971 1727096417.24170: variable 'ansible_facts' from source: unknown 24971 1727096417.24233: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096417.24416: variable 'ansible_facts' from source: unknown 24971 1727096417.24479: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096417.24554: attempt loop complete, returning result 24971 1727096417.24557: _execute() done 24971 1727096417.24559: dumping result to json 24971 1727096417.24582: done dumping result, returning 24971 1727096417.24589: done running TaskExecutor() for managed_node3/TASK: Gathering Facts [0afff68d-5257-3482-6844-000000000115] 24971 1727096417.24594: sending task result for task 0afff68d-5257-3482-6844-000000000115 ok: [managed_node3] 24971 1727096417.25063: no more pending results, returning what we have 24971 1727096417.25065: results queue empty 24971 1727096417.25065: checking for any_errors_fatal 24971 1727096417.25066: done checking for any_errors_fatal 24971 1727096417.25066: checking for max_fail_percentage 24971 1727096417.25069: done checking for max_fail_percentage 24971 1727096417.25070: checking to see if all hosts have failed and the running result is not ok 24971 1727096417.25071: done checking to see if all hosts have failed 24971 1727096417.25071: getting the remaining hosts for this loop 24971 1727096417.25072: done getting the remaining hosts for this loop 24971 1727096417.25074: getting the next task for host managed_node3 24971 1727096417.25078: done getting next task for host managed_node3 24971 1727096417.25079: ^ task is: TASK: meta (flush_handlers) 24971 1727096417.25080: ^ state is: HOST 
STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24971 1727096417.25082: getting variables 24971 1727096417.25083: in VariableManager get_vars() 24971 1727096417.25110: Calling all_inventory to load vars for managed_node3 24971 1727096417.25112: Calling groups_inventory to load vars for managed_node3 24971 1727096417.25113: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096417.25122: Calling all_plugins_play to load vars for managed_node3 24971 1727096417.25124: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096417.25126: Calling groups_plugins_play to load vars for managed_node3 24971 1727096417.25229: done sending task result for task 0afff68d-5257-3482-6844-000000000115 24971 1727096417.25233: WORKER PROCESS EXITING 24971 1727096417.25242: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096417.25358: done with get_vars() 24971 1727096417.25365: done getting variables 24971 1727096417.25416: in VariableManager get_vars() 24971 1727096417.25427: Calling all_inventory to load vars for managed_node3 24971 1727096417.25429: Calling groups_inventory to load vars for managed_node3 24971 1727096417.25430: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096417.25433: Calling all_plugins_play to load vars for managed_node3 24971 1727096417.25434: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096417.25436: Calling groups_plugins_play to load vars for managed_node3 24971 1727096417.25533: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096417.25704: done with get_vars() 24971 1727096417.25716: done queuing things up, now waiting for results queue to drain 24971 1727096417.25718: results queue empty 24971 1727096417.25719: checking for any_errors_fatal 24971 1727096417.25721: done checking for any_errors_fatal 24971 1727096417.25722: checking for max_fail_percentage 24971 1727096417.25723: done checking for max_fail_percentage 24971 1727096417.25724: checking to see if all hosts have failed and the running result is not ok 24971 1727096417.25729: done checking to see if all hosts have failed 24971 1727096417.25730: getting the remaining hosts for this loop 24971 1727096417.25730: done getting the remaining hosts for this loop 24971 1727096417.25733: getting the next task for host managed_node3 24971 1727096417.25736: done getting next task for host managed_node3 24971 1727096417.25738: ^ task is: TASK: Include the task 'show_interfaces.yml' 24971 1727096417.25740: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24971 1727096417.25742: getting variables 24971 1727096417.25742: in VariableManager get_vars() 24971 1727096417.25754: Calling all_inventory to load vars for managed_node3 24971 1727096417.25756: Calling groups_inventory to load vars for managed_node3 24971 1727096417.25757: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096417.25762: Calling all_plugins_play to load vars for managed_node3 24971 1727096417.25764: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096417.25771: Calling groups_plugins_play to load vars for managed_node3 24971 1727096417.25930: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096417.26155: done with get_vars() 24971 1727096417.26162: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:9 Monday 23 September 2024 09:00:17 -0400 (0:00:01.100) 0:00:04.740 ****** 24971 1727096417.26237: entering _queue_task() for managed_node3/include_tasks 24971 1727096417.26598: worker is 1 (out of 1 available) 24971 1727096417.26611: exiting _queue_task() for managed_node3/include_tasks 24971 1727096417.26622: done queuing things up, now waiting for results queue to drain 24971 1727096417.26623: waiting for pending results... 24971 1727096417.26755: running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' 24971 1727096417.26818: in run() - task 0afff68d-5257-3482-6844-00000000000b 24971 1727096417.26828: variable 'ansible_search_path' from source: unknown 24971 1727096417.26857: calling self._execute() 24971 1727096417.26918: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096417.26923: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096417.26931: variable 'omit' from source: magic vars 24971 1727096417.27196: variable 'ansible_distribution_major_version' from source: facts 24971 1727096417.27207: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096417.27211: _execute() done 24971 1727096417.27216: dumping result to json 24971 1727096417.27220: done dumping result, returning 24971 1727096417.27223: done running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' [0afff68d-5257-3482-6844-00000000000b] 24971 1727096417.27234: sending task result for task 0afff68d-5257-3482-6844-00000000000b 24971 1727096417.27308: done sending task result for task 0afff68d-5257-3482-6844-00000000000b 24971 1727096417.27310: WORKER PROCESS EXITING 24971 1727096417.27342: no more pending results, returning what we have 24971 1727096417.27347: in VariableManager get_vars() 24971 1727096417.27389: Calling all_inventory to load vars for managed_node3 24971 1727096417.27392: Calling groups_inventory to load vars for managed_node3 24971 1727096417.27394: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096417.27403: Calling all_plugins_play to load vars for managed_node3 24971 1727096417.27406: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096417.27408: Calling groups_plugins_play to load vars for managed_node3 24971 1727096417.27533: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096417.27644: done with get_vars() 24971 1727096417.27649: variable 
'ansible_search_path' from source: unknown 24971 1727096417.27658: we have included files to process 24971 1727096417.27659: generating all_blocks data 24971 1727096417.27660: done generating all_blocks data 24971 1727096417.27661: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 24971 1727096417.27661: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 24971 1727096417.27663: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 24971 1727096417.27763: in VariableManager get_vars() 24971 1727096417.27781: done with get_vars() 24971 1727096417.27853: done processing included file 24971 1727096417.27854: iterating over new_blocks loaded from include file 24971 1727096417.27855: in VariableManager get_vars() 24971 1727096417.27866: done with get_vars() 24971 1727096417.27870: filtering new block on tags 24971 1727096417.27882: done filtering new block on tags 24971 1727096417.27883: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node3 24971 1727096417.27886: extending task lists for all hosts with included blocks 24971 1727096417.27929: done extending task lists 24971 1727096417.27930: done processing included files 24971 1727096417.27931: results queue empty 24971 1727096417.27931: checking for any_errors_fatal 24971 1727096417.27932: done checking for any_errors_fatal 24971 1727096417.27932: checking for max_fail_percentage 24971 1727096417.27933: done checking for max_fail_percentage 24971 1727096417.27933: checking to see if all hosts have failed and the running result is not ok 24971 1727096417.27934: done checking to see if all hosts have failed 24971 1727096417.27934: getting the remaining hosts for this loop 24971 1727096417.27935: done getting the remaining hosts for this loop 24971 1727096417.27936: getting the next task for host managed_node3 24971 1727096417.27939: done getting next task for host managed_node3 24971 1727096417.27940: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 24971 1727096417.27941: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24971 1727096417.27943: getting variables 24971 1727096417.27943: in VariableManager get_vars() 24971 1727096417.27952: Calling all_inventory to load vars for managed_node3 24971 1727096417.27953: Calling groups_inventory to load vars for managed_node3 24971 1727096417.27954: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096417.27957: Calling all_plugins_play to load vars for managed_node3 24971 1727096417.27959: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096417.27960: Calling groups_plugins_play to load vars for managed_node3 24971 1727096417.28062: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096417.28177: done with get_vars() 24971 1727096417.28183: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Monday 23 September 2024 09:00:17 -0400 (0:00:00.019) 0:00:04.760 ****** 24971 1727096417.28227: entering _queue_task() for managed_node3/include_tasks 24971 1727096417.28416: worker is 1 (out of 1 available) 24971 1727096417.28428: exiting _queue_task() for managed_node3/include_tasks 24971 1727096417.28440: done queuing things up, now waiting for results queue to drain 24971 1727096417.28441: waiting for pending results... 24971 1727096417.28782: running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' 24971 1727096417.28786: in run() - task 0afff68d-5257-3482-6844-00000000012b 24971 1727096417.28790: variable 'ansible_search_path' from source: unknown 24971 1727096417.28792: variable 'ansible_search_path' from source: unknown 24971 1727096417.28808: calling self._execute() 24971 1727096417.28885: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096417.28912: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096417.29021: variable 'omit' from source: magic vars 24971 1727096417.29300: variable 'ansible_distribution_major_version' from source: facts 24971 1727096417.29317: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096417.29328: _execute() done 24971 1727096417.29335: dumping result to json 24971 1727096417.29352: done dumping result, returning 24971 1727096417.29363: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' [0afff68d-5257-3482-6844-00000000012b] 24971 1727096417.29375: sending task result for task 0afff68d-5257-3482-6844-00000000012b 24971 1727096417.29587: no more pending results, returning what we have 24971 1727096417.29592: in VariableManager get_vars() 24971 1727096417.29637: Calling all_inventory to load vars for managed_node3 24971 1727096417.29640: Calling groups_inventory to load vars for managed_node3 24971 1727096417.29643: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096417.29658: Calling all_plugins_play to load vars for managed_node3 24971 1727096417.29661: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096417.29665: Calling groups_plugins_play to load vars for managed_node3 24971 1727096417.29922: done sending task result for task 0afff68d-5257-3482-6844-00000000012b 24971 1727096417.29925: WORKER PROCESS EXITING 24971 1727096417.29946: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' 
skipped due to reserved name 24971 1727096417.30074: done with get_vars() 24971 1727096417.30080: variable 'ansible_search_path' from source: unknown 24971 1727096417.30081: variable 'ansible_search_path' from source: unknown 24971 1727096417.30106: we have included files to process 24971 1727096417.30107: generating all_blocks data 24971 1727096417.30109: done generating all_blocks data 24971 1727096417.30110: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 24971 1727096417.30111: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 24971 1727096417.30112: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 24971 1727096417.30339: done processing included file 24971 1727096417.30340: iterating over new_blocks loaded from include file 24971 1727096417.30341: in VariableManager get_vars() 24971 1727096417.30352: done with get_vars() 24971 1727096417.30354: filtering new block on tags 24971 1727096417.30364: done filtering new block on tags 24971 1727096417.30365: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node3 24971 1727096417.30372: extending task lists for all hosts with included blocks 24971 1727096417.30428: done extending task lists 24971 1727096417.30429: done processing included files 24971 1727096417.30430: results queue empty 24971 1727096417.30430: checking for any_errors_fatal 24971 1727096417.30433: done checking for any_errors_fatal 24971 1727096417.30434: checking for max_fail_percentage 24971 1727096417.30435: done checking for max_fail_percentage 24971 1727096417.30436: checking to see if all hosts have failed and the running result is not ok 24971 1727096417.30436: done checking to see if all hosts have failed 24971 1727096417.30437: getting the remaining hosts for this loop 24971 1727096417.30437: done getting the remaining hosts for this loop 24971 1727096417.30439: getting the next task for host managed_node3 24971 1727096417.30441: done getting next task for host managed_node3 24971 1727096417.30443: ^ task is: TASK: Gather current interface info 24971 1727096417.30445: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24971 1727096417.30446: getting variables 24971 1727096417.30447: in VariableManager get_vars() 24971 1727096417.30455: Calling all_inventory to load vars for managed_node3 24971 1727096417.30456: Calling groups_inventory to load vars for managed_node3 24971 1727096417.30457: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096417.30461: Calling all_plugins_play to load vars for managed_node3 24971 1727096417.30462: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096417.30464: Calling groups_plugins_play to load vars for managed_node3 24971 1727096417.30546: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096417.30656: done with get_vars() 24971 1727096417.30663: done getting variables 24971 1727096417.30692: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Monday 23 September 2024 09:00:17 -0400 (0:00:00.024) 0:00:04.784 ****** 24971 1727096417.30711: entering _queue_task() for managed_node3/command 24971 1727096417.30901: worker is 1 (out of 1 available) 24971 1727096417.30916: exiting _queue_task() for managed_node3/command 24971 1727096417.30927: done queuing things up, now waiting for results queue to drain 24971 1727096417.30928: waiting for pending results... 
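For orientation: the task queued here comes from get_current_interfaces.yml, which show_interfaces.yml includes (and which tests_ipv6.yml:9 pulled in above). Judging from the module arguments and the registered variable that appear later in this log (chdir /sys/class/net, _raw_params "ls -1", and the _current_interfaces variable), the task is most likely a plain command task along the following lines; this is a hedged reconstruction from the log, not the verified file contents:

    # Hedged sketch of get_current_interfaces.yml:3. The task name, command,
    # chdir and register target are taken from this log; layout is assumed.
    - name: Gather current interface info
      command: ls -1
      args:
        chdir: /sys/class/net
      register: _current_interfaces

Listing /sys/class/net yields one entry per kernel network interface, which is why the result further down contains eth0 and lo, plus bonding_masters, a control entry that shows up there when the kernel bonding module is loaded and is not itself an interface.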
24971 1727096417.31059: running TaskExecutor() for managed_node3/TASK: Gather current interface info 24971 1727096417.31130: in run() - task 0afff68d-5257-3482-6844-00000000013a 24971 1727096417.31140: variable 'ansible_search_path' from source: unknown 24971 1727096417.31143: variable 'ansible_search_path' from source: unknown 24971 1727096417.31182: calling self._execute() 24971 1727096417.31237: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096417.31240: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096417.31250: variable 'omit' from source: magic vars 24971 1727096417.31639: variable 'ansible_distribution_major_version' from source: facts 24971 1727096417.31674: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096417.31677: variable 'omit' from source: magic vars 24971 1727096417.31707: variable 'omit' from source: magic vars 24971 1727096417.31780: variable 'omit' from source: magic vars 24971 1727096417.31797: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24971 1727096417.31836: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24971 1727096417.31863: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24971 1727096417.31900: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096417.31916: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096417.31951: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24971 1727096417.31961: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096417.31969: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096417.32105: Set connection var ansible_shell_type to sh 24971 1727096417.32108: Set connection var ansible_shell_executable to /bin/sh 24971 1727096417.32111: Set connection var ansible_timeout to 10 24971 1727096417.32171: Set connection var ansible_connection to ssh 24971 1727096417.32174: Set connection var ansible_pipelining to False 24971 1727096417.32176: Set connection var ansible_module_compression to ZIP_DEFLATED 24971 1727096417.32178: variable 'ansible_shell_executable' from source: unknown 24971 1727096417.32181: variable 'ansible_connection' from source: unknown 24971 1727096417.32193: variable 'ansible_module_compression' from source: unknown 24971 1727096417.32196: variable 'ansible_shell_type' from source: unknown 24971 1727096417.32198: variable 'ansible_shell_executable' from source: unknown 24971 1727096417.32201: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096417.32203: variable 'ansible_pipelining' from source: unknown 24971 1727096417.32205: variable 'ansible_timeout' from source: unknown 24971 1727096417.32214: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096417.32360: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24971 1727096417.32384: variable 'omit' from source: magic vars 24971 
1727096417.32396: starting attempt loop 24971 1727096417.32402: running the handler 24971 1727096417.32424: _low_level_execute_command(): starting 24971 1727096417.32437: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24971 1727096417.33215: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration <<< 24971 1727096417.33310: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096417.33350: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096417.33372: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096417.33415: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096417.33462: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 24971 1727096417.35790: stdout chunk (state=3): >>>/root <<< 24971 1727096417.35937: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096417.35973: stderr chunk (state=3): >>><<< 24971 1727096417.35976: stdout chunk (state=3): >>><<< 24971 1727096417.35990: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 24971 1727096417.36002: _low_level_execute_command(): starting 24971 1727096417.36008: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096417.359895-25206-160358430947235 `" && echo 
ansible-tmp-1727096417.359895-25206-160358430947235="` echo /root/.ansible/tmp/ansible-tmp-1727096417.359895-25206-160358430947235 `" ) && sleep 0' 24971 1727096417.36429: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096417.36436: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24971 1727096417.36460: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096417.36464: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096417.36478: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096417.36525: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096417.36528: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096417.36571: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 24971 1727096417.39312: stdout chunk (state=3): >>>ansible-tmp-1727096417.359895-25206-160358430947235=/root/.ansible/tmp/ansible-tmp-1727096417.359895-25206-160358430947235 <<< 24971 1727096417.39488: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096417.39524: stderr chunk (state=3): >>><<< 24971 1727096417.39527: stdout chunk (state=3): >>><<< 24971 1727096417.39540: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096417.359895-25206-160358430947235=/root/.ansible/tmp/ansible-tmp-1727096417.359895-25206-160358430947235 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 24971 1727096417.39566: variable 'ansible_module_compression' from source: 
unknown 24971 1727096417.39615: ANSIBALLZ: Using generic lock for ansible.legacy.command 24971 1727096417.39618: ANSIBALLZ: Acquiring lock 24971 1727096417.39621: ANSIBALLZ: Lock acquired: 139839577444416 24971 1727096417.39623: ANSIBALLZ: Creating module 24971 1727096417.47251: ANSIBALLZ: Writing module into payload 24971 1727096417.47313: ANSIBALLZ: Writing module 24971 1727096417.47330: ANSIBALLZ: Renaming module 24971 1727096417.47336: ANSIBALLZ: Done creating module 24971 1727096417.47351: variable 'ansible_facts' from source: unknown 24971 1727096417.47401: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096417.359895-25206-160358430947235/AnsiballZ_command.py 24971 1727096417.47504: Sending initial data 24971 1727096417.47508: Sent initial data (155 bytes) 24971 1727096417.47980: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096417.47984: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 24971 1727096417.47986: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096417.47988: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096417.47990: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096417.48034: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096417.48049: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096417.48088: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 24971 1727096417.50288: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24971 1727096417.50318: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 24971 1727096417.50358: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24971ctsu8jan/tmpf29cra04 /root/.ansible/tmp/ansible-tmp-1727096417.359895-25206-160358430947235/AnsiballZ_command.py <<< 24971 1727096417.50361: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096417.359895-25206-160358430947235/AnsiballZ_command.py" <<< 24971 1727096417.50391: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24971ctsu8jan/tmpf29cra04" to remote "/root/.ansible/tmp/ansible-tmp-1727096417.359895-25206-160358430947235/AnsiballZ_command.py" <<< 24971 1727096417.50395: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096417.359895-25206-160358430947235/AnsiballZ_command.py" <<< 24971 1727096417.50900: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096417.50948: stderr chunk (state=3): >>><<< 24971 1727096417.50953: stdout chunk (state=3): >>><<< 24971 1727096417.50980: done transferring module to remote 24971 1727096417.50989: _low_level_execute_command(): starting 24971 1727096417.50994: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096417.359895-25206-160358430947235/ /root/.ansible/tmp/ansible-tmp-1727096417.359895-25206-160358430947235/AnsiballZ_command.py && sleep 0' 24971 1727096417.51461: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096417.51465: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 24971 1727096417.51469: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096417.51472: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration <<< 24971 1727096417.51474: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096417.51481: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096417.51529: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096417.51532: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096417.51536: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096417.51574: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 24971 1727096417.54156: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096417.54160: stdout chunk (state=3): >>><<< 24971 1727096417.54163: stderr chunk (state=3): >>><<< 24971 1727096417.54188: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 24971 1727096417.54192: _low_level_execute_command(): starting 24971 1727096417.54194: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096417.359895-25206-160358430947235/AnsiballZ_command.py && sleep 0' 24971 1727096417.54639: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096417.54643: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24971 1727096417.54677: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 24971 1727096417.54680: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096417.54682: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096417.54684: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096417.54730: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096417.54735: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096417.54748: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096417.54795: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 24971 1727096417.78988: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-23 09:00:17.781814", "end": "2024-09-23 09:00:17.786476", "delta": "0:00:00.004662", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 24971 
1727096417.81095: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. <<< 24971 1727096417.81099: stdout chunk (state=3): >>><<< 24971 1727096417.81102: stderr chunk (state=3): >>><<< 24971 1727096417.81202: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-23 09:00:17.781814", "end": "2024-09-23 09:00:17.786476", "delta": "0:00:00.004662", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
24971 1727096417.81206: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096417.359895-25206-160358430947235/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24971 1727096417.81209: _low_level_execute_command(): starting 24971 1727096417.81211: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096417.359895-25206-160358430947235/ > /dev/null 2>&1 && sleep 0' 24971 1727096417.81870: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24971 1727096417.81958: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096417.81985: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096417.82003: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096417.82025: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096417.82101: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 24971 1727096417.84709: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096417.84740: stderr chunk (state=3): >>><<< 24971 1727096417.84743: stdout chunk (state=3): >>><<< 24971 1727096417.84758: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 24971 1727096417.84764: handler run complete 24971 1727096417.84792: Evaluated conditional (False): False 24971 1727096417.84796: attempt loop complete, returning result 24971 1727096417.84799: _execute() done 24971 1727096417.84801: dumping result to json 24971 1727096417.84807: done dumping result, returning 24971 1727096417.84817: done running TaskExecutor() for managed_node3/TASK: Gather current interface info [0afff68d-5257-3482-6844-00000000013a] 24971 1727096417.84824: sending task result for task 0afff68d-5257-3482-6844-00000000013a 24971 1727096417.84911: done sending task result for task 0afff68d-5257-3482-6844-00000000013a 24971 1727096417.84913: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.004662", "end": "2024-09-23 09:00:17.786476", "rc": 0, "start": "2024-09-23 09:00:17.781814" } STDOUT: bonding_masters eth0 lo 24971 1727096417.84996: no more pending results, returning what we have 24971 1727096417.84999: results queue empty 24971 1727096417.84999: checking for any_errors_fatal 24971 1727096417.85001: done checking for any_errors_fatal 24971 1727096417.85002: checking for max_fail_percentage 24971 1727096417.85003: done checking for max_fail_percentage 24971 1727096417.85004: checking to see if all hosts have failed and the running result is not ok 24971 1727096417.85004: done checking to see if all hosts have failed 24971 1727096417.85005: getting the remaining hosts for this loop 24971 1727096417.85006: done getting the remaining hosts for this loop 24971 1727096417.85009: getting the next task for host managed_node3 24971 1727096417.85015: done getting next task for host managed_node3 24971 1727096417.85017: ^ task is: TASK: Set current_interfaces 24971 1727096417.85021: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24971 1727096417.85027: getting variables 24971 1727096417.85028: in VariableManager get_vars() 24971 1727096417.85068: Calling all_inventory to load vars for managed_node3 24971 1727096417.85071: Calling groups_inventory to load vars for managed_node3 24971 1727096417.85073: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096417.85084: Calling all_plugins_play to load vars for managed_node3 24971 1727096417.85087: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096417.85089: Calling groups_plugins_play to load vars for managed_node3 24971 1727096417.85256: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096417.85382: done with get_vars() 24971 1727096417.85393: done getting variables 24971 1727096417.85434: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Monday 23 September 2024 09:00:17 -0400 (0:00:00.547) 0:00:05.332 ****** 24971 1727096417.85457: entering _queue_task() for managed_node3/set_fact 24971 1727096417.85653: worker is 1 (out of 1 available) 24971 1727096417.85665: exiting _queue_task() for managed_node3/set_fact 24971 1727096417.85681: done queuing things up, now waiting for results queue to drain 24971 1727096417.85682: waiting for pending results... 
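The set_fact task queued here (get_current_interfaces.yml:9) turns the registered command output into the current_interfaces fact shown in the result below. A plausible sketch, assuming the fact is simply the stdout_lines of the registered _current_interfaces result; the exact expression is not visible in this log:

    # Hedged sketch of get_current_interfaces.yml:9. Only the task name and the
    # resulting fact value are confirmed by the log; the expression is assumed.
    - name: Set current_interfaces
      set_fact:
        current_interfaces: "{{ _current_interfaces.stdout_lines }}"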
24971 1727096417.85818: running TaskExecutor() for managed_node3/TASK: Set current_interfaces 24971 1727096417.85887: in run() - task 0afff68d-5257-3482-6844-00000000013b 24971 1727096417.85897: variable 'ansible_search_path' from source: unknown 24971 1727096417.85902: variable 'ansible_search_path' from source: unknown 24971 1727096417.85931: calling self._execute() 24971 1727096417.85991: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096417.85995: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096417.86002: variable 'omit' from source: magic vars 24971 1727096417.86272: variable 'ansible_distribution_major_version' from source: facts 24971 1727096417.86281: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096417.86286: variable 'omit' from source: magic vars 24971 1727096417.86316: variable 'omit' from source: magic vars 24971 1727096417.86389: variable '_current_interfaces' from source: set_fact 24971 1727096417.86437: variable 'omit' from source: magic vars 24971 1727096417.86473: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24971 1727096417.86497: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24971 1727096417.86513: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24971 1727096417.86526: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096417.86535: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096417.86559: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24971 1727096417.86563: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096417.86566: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096417.86630: Set connection var ansible_shell_type to sh 24971 1727096417.86637: Set connection var ansible_shell_executable to /bin/sh 24971 1727096417.86646: Set connection var ansible_timeout to 10 24971 1727096417.86651: Set connection var ansible_connection to ssh 24971 1727096417.86656: Set connection var ansible_pipelining to False 24971 1727096417.86661: Set connection var ansible_module_compression to ZIP_DEFLATED 24971 1727096417.86680: variable 'ansible_shell_executable' from source: unknown 24971 1727096417.86685: variable 'ansible_connection' from source: unknown 24971 1727096417.86688: variable 'ansible_module_compression' from source: unknown 24971 1727096417.86690: variable 'ansible_shell_type' from source: unknown 24971 1727096417.86692: variable 'ansible_shell_executable' from source: unknown 24971 1727096417.86694: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096417.86696: variable 'ansible_pipelining' from source: unknown 24971 1727096417.86699: variable 'ansible_timeout' from source: unknown 24971 1727096417.86700: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096417.86800: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, 
class_only=False) 24971 1727096417.86808: variable 'omit' from source: magic vars 24971 1727096417.86812: starting attempt loop 24971 1727096417.86815: running the handler 24971 1727096417.86829: handler run complete 24971 1727096417.86835: attempt loop complete, returning result 24971 1727096417.86838: _execute() done 24971 1727096417.86840: dumping result to json 24971 1727096417.86843: done dumping result, returning 24971 1727096417.86850: done running TaskExecutor() for managed_node3/TASK: Set current_interfaces [0afff68d-5257-3482-6844-00000000013b] 24971 1727096417.86853: sending task result for task 0afff68d-5257-3482-6844-00000000013b 24971 1727096417.86929: done sending task result for task 0afff68d-5257-3482-6844-00000000013b 24971 1727096417.86933: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 24971 1727096417.86990: no more pending results, returning what we have 24971 1727096417.86992: results queue empty 24971 1727096417.86993: checking for any_errors_fatal 24971 1727096417.86999: done checking for any_errors_fatal 24971 1727096417.86999: checking for max_fail_percentage 24971 1727096417.87001: done checking for max_fail_percentage 24971 1727096417.87001: checking to see if all hosts have failed and the running result is not ok 24971 1727096417.87002: done checking to see if all hosts have failed 24971 1727096417.87003: getting the remaining hosts for this loop 24971 1727096417.87004: done getting the remaining hosts for this loop 24971 1727096417.87007: getting the next task for host managed_node3 24971 1727096417.87013: done getting next task for host managed_node3 24971 1727096417.87015: ^ task is: TASK: Show current_interfaces 24971 1727096417.87017: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24971 1727096417.87020: getting variables 24971 1727096417.87021: in VariableManager get_vars() 24971 1727096417.87053: Calling all_inventory to load vars for managed_node3 24971 1727096417.87055: Calling groups_inventory to load vars for managed_node3 24971 1727096417.87057: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096417.87066: Calling all_plugins_play to load vars for managed_node3 24971 1727096417.87080: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096417.87083: Calling groups_plugins_play to load vars for managed_node3 24971 1727096417.87199: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096417.87317: done with get_vars() 24971 1727096417.87324: done getting variables 24971 1727096417.87391: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Monday 23 September 2024 09:00:17 -0400 (0:00:00.019) 0:00:05.351 ****** 24971 1727096417.87414: entering _queue_task() for managed_node3/debug 24971 1727096417.87415: Creating lock for debug 24971 1727096417.87606: worker is 1 (out of 1 available) 24971 1727096417.87618: exiting _queue_task() for managed_node3/debug 24971 1727096417.87629: done queuing things up, now waiting for results queue to drain 24971 1727096417.87630: waiting for pending results... 
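The debug task queued next (show_interfaces.yml:5) prints the fact that was just set. Given the MSG line in the result below ("current_interfaces: ['bonding_masters', 'eth0', 'lo']"), it is roughly equivalent to the following; the exact msg template is an assumption:

    # Hedged sketch of show_interfaces.yml:5.
    - name: Show current_interfaces
      debug:
        msg: "current_interfaces: {{ current_interfaces }}"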
24971 1727096417.87776: running TaskExecutor() for managed_node3/TASK: Show current_interfaces 24971 1727096417.87830: in run() - task 0afff68d-5257-3482-6844-00000000012c 24971 1727096417.87840: variable 'ansible_search_path' from source: unknown 24971 1727096417.87844: variable 'ansible_search_path' from source: unknown 24971 1727096417.87880: calling self._execute() 24971 1727096417.87999: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096417.88004: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096417.88013: variable 'omit' from source: magic vars 24971 1727096417.88261: variable 'ansible_distribution_major_version' from source: facts 24971 1727096417.88275: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096417.88282: variable 'omit' from source: magic vars 24971 1727096417.88312: variable 'omit' from source: magic vars 24971 1727096417.88378: variable 'current_interfaces' from source: set_fact 24971 1727096417.88404: variable 'omit' from source: magic vars 24971 1727096417.88432: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24971 1727096417.88458: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24971 1727096417.88477: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24971 1727096417.88490: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096417.88498: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096417.88526: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24971 1727096417.88529: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096417.88531: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096417.88597: Set connection var ansible_shell_type to sh 24971 1727096417.88604: Set connection var ansible_shell_executable to /bin/sh 24971 1727096417.88613: Set connection var ansible_timeout to 10 24971 1727096417.88616: Set connection var ansible_connection to ssh 24971 1727096417.88625: Set connection var ansible_pipelining to False 24971 1727096417.88628: Set connection var ansible_module_compression to ZIP_DEFLATED 24971 1727096417.88645: variable 'ansible_shell_executable' from source: unknown 24971 1727096417.88649: variable 'ansible_connection' from source: unknown 24971 1727096417.88651: variable 'ansible_module_compression' from source: unknown 24971 1727096417.88653: variable 'ansible_shell_type' from source: unknown 24971 1727096417.88656: variable 'ansible_shell_executable' from source: unknown 24971 1727096417.88658: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096417.88660: variable 'ansible_pipelining' from source: unknown 24971 1727096417.88662: variable 'ansible_timeout' from source: unknown 24971 1727096417.88666: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096417.88773: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 
24971 1727096417.88784: variable 'omit' from source: magic vars 24971 1727096417.88787: starting attempt loop 24971 1727096417.88790: running the handler 24971 1727096417.88825: handler run complete 24971 1727096417.88837: attempt loop complete, returning result 24971 1727096417.88840: _execute() done 24971 1727096417.88842: dumping result to json 24971 1727096417.88846: done dumping result, returning 24971 1727096417.88850: done running TaskExecutor() for managed_node3/TASK: Show current_interfaces [0afff68d-5257-3482-6844-00000000012c] 24971 1727096417.88852: sending task result for task 0afff68d-5257-3482-6844-00000000012c 24971 1727096417.88932: done sending task result for task 0afff68d-5257-3482-6844-00000000012c 24971 1727096417.88935: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 24971 1727096417.89060: no more pending results, returning what we have 24971 1727096417.89063: results queue empty 24971 1727096417.89064: checking for any_errors_fatal 24971 1727096417.89067: done checking for any_errors_fatal 24971 1727096417.89069: checking for max_fail_percentage 24971 1727096417.89071: done checking for max_fail_percentage 24971 1727096417.89072: checking to see if all hosts have failed and the running result is not ok 24971 1727096417.89072: done checking to see if all hosts have failed 24971 1727096417.89073: getting the remaining hosts for this loop 24971 1727096417.89074: done getting the remaining hosts for this loop 24971 1727096417.89077: getting the next task for host managed_node3 24971 1727096417.89082: done getting next task for host managed_node3 24971 1727096417.89084: ^ task is: TASK: Include the task 'manage_test_interface.yml' 24971 1727096417.89086: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24971 1727096417.89089: getting variables 24971 1727096417.89090: in VariableManager get_vars() 24971 1727096417.89116: Calling all_inventory to load vars for managed_node3 24971 1727096417.89123: Calling groups_inventory to load vars for managed_node3 24971 1727096417.89126: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096417.89133: Calling all_plugins_play to load vars for managed_node3 24971 1727096417.89134: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096417.89136: Calling groups_plugins_play to load vars for managed_node3 24971 1727096417.89234: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096417.89347: done with get_vars() 24971 1727096417.89354: done getting variables TASK [Include the task 'manage_test_interface.yml'] **************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:11 Monday 23 September 2024 09:00:17 -0400 (0:00:00.019) 0:00:05.371 ****** 24971 1727096417.89414: entering _queue_task() for managed_node3/include_tasks 24971 1727096417.89598: worker is 1 (out of 1 available) 24971 1727096417.89610: exiting _queue_task() for managed_node3/include_tasks 24971 1727096417.89623: done queuing things up, now waiting for results queue to drain 24971 1727096417.89624: waiting for pending results... 
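The include queued here corresponds to tests_ipv6.yml:11. Given the task name and the file that is loaded immediately afterwards, it is presumably a plain include_tasks call like this (a sketch, not the verified playbook source):

    # Hedged sketch of tests_ipv6.yml:11.
    - name: Include the task 'manage_test_interface.yml'
      include_tasks: tasks/manage_test_interface.yml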
24971 1727096417.89767: running TaskExecutor() for managed_node3/TASK: Include the task 'manage_test_interface.yml' 24971 1727096417.89823: in run() - task 0afff68d-5257-3482-6844-00000000000c 24971 1727096417.89833: variable 'ansible_search_path' from source: unknown 24971 1727096417.89866: calling self._execute() 24971 1727096417.89922: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096417.89926: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096417.89935: variable 'omit' from source: magic vars 24971 1727096417.90193: variable 'ansible_distribution_major_version' from source: facts 24971 1727096417.90204: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096417.90209: _execute() done 24971 1727096417.90212: dumping result to json 24971 1727096417.90215: done dumping result, returning 24971 1727096417.90221: done running TaskExecutor() for managed_node3/TASK: Include the task 'manage_test_interface.yml' [0afff68d-5257-3482-6844-00000000000c] 24971 1727096417.90225: sending task result for task 0afff68d-5257-3482-6844-00000000000c 24971 1727096417.90306: done sending task result for task 0afff68d-5257-3482-6844-00000000000c 24971 1727096417.90309: WORKER PROCESS EXITING 24971 1727096417.90335: no more pending results, returning what we have 24971 1727096417.90339: in VariableManager get_vars() 24971 1727096417.90384: Calling all_inventory to load vars for managed_node3 24971 1727096417.90387: Calling groups_inventory to load vars for managed_node3 24971 1727096417.90389: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096417.90398: Calling all_plugins_play to load vars for managed_node3 24971 1727096417.90400: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096417.90402: Calling groups_plugins_play to load vars for managed_node3 24971 1727096417.90527: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096417.90659: done with get_vars() 24971 1727096417.90664: variable 'ansible_search_path' from source: unknown 24971 1727096417.90678: we have included files to process 24971 1727096417.90679: generating all_blocks data 24971 1727096417.90680: done generating all_blocks data 24971 1727096417.90684: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 24971 1727096417.90684: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 24971 1727096417.90686: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 24971 1727096417.91007: in VariableManager get_vars() 24971 1727096417.91021: done with get_vars() 24971 1727096417.91165: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 24971 1727096417.91546: done processing included file 24971 1727096417.91548: iterating over new_blocks loaded from include file 24971 1727096417.91549: in VariableManager get_vars() 24971 1727096417.91561: done with get_vars() 24971 1727096417.91562: filtering new block on tags 24971 1727096417.91585: done filtering new block on tags 24971 1727096417.91587: done iterating over new_blocks loaded from include file included: 
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml for managed_node3 24971 1727096417.91590: extending task lists for all hosts with included blocks 24971 1727096417.91684: done extending task lists 24971 1727096417.91685: done processing included files 24971 1727096417.91685: results queue empty 24971 1727096417.91686: checking for any_errors_fatal 24971 1727096417.91688: done checking for any_errors_fatal 24971 1727096417.91688: checking for max_fail_percentage 24971 1727096417.91689: done checking for max_fail_percentage 24971 1727096417.91689: checking to see if all hosts have failed and the running result is not ok 24971 1727096417.91690: done checking to see if all hosts have failed 24971 1727096417.91690: getting the remaining hosts for this loop 24971 1727096417.91691: done getting the remaining hosts for this loop 24971 1727096417.91692: getting the next task for host managed_node3 24971 1727096417.91695: done getting next task for host managed_node3 24971 1727096417.91696: ^ task is: TASK: Ensure state in ["present", "absent"] 24971 1727096417.91697: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24971 1727096417.91699: getting variables 24971 1727096417.91699: in VariableManager get_vars() 24971 1727096417.91708: Calling all_inventory to load vars for managed_node3 24971 1727096417.91709: Calling groups_inventory to load vars for managed_node3 24971 1727096417.91710: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096417.91715: Calling all_plugins_play to load vars for managed_node3 24971 1727096417.91716: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096417.91718: Calling groups_plugins_play to load vars for managed_node3 24971 1727096417.91801: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096417.91912: done with get_vars() 24971 1727096417.91918: done getting variables 24971 1727096417.91959: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Ensure state in ["present", "absent"]] *********************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:3 Monday 23 September 2024 09:00:17 -0400 (0:00:00.025) 0:00:05.397 ****** 24971 1727096417.91984: entering _queue_task() for managed_node3/fail 24971 1727096417.91985: Creating lock for fail 24971 1727096417.92196: worker is 1 (out of 1 available) 24971 1727096417.92207: exiting _queue_task() for managed_node3/fail 24971 1727096417.92220: done queuing things up, now waiting for results queue to drain 24971 1727096417.92221: waiting for pending results... 
24971 1727096417.92361: running TaskExecutor() for managed_node3/TASK: Ensure state in ["present", "absent"] 24971 1727096417.92422: in run() - task 0afff68d-5257-3482-6844-000000000156 24971 1727096417.92432: variable 'ansible_search_path' from source: unknown 24971 1727096417.92436: variable 'ansible_search_path' from source: unknown 24971 1727096417.92466: calling self._execute() 24971 1727096417.92535: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096417.92539: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096417.92547: variable 'omit' from source: magic vars 24971 1727096417.92852: variable 'ansible_distribution_major_version' from source: facts 24971 1727096417.92862: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096417.92957: variable 'state' from source: include params 24971 1727096417.92961: Evaluated conditional (state not in ["present", "absent"]): False 24971 1727096417.92963: when evaluation is False, skipping this task 24971 1727096417.92966: _execute() done 24971 1727096417.92971: dumping result to json 24971 1727096417.92977: done dumping result, returning 24971 1727096417.92984: done running TaskExecutor() for managed_node3/TASK: Ensure state in ["present", "absent"] [0afff68d-5257-3482-6844-000000000156] 24971 1727096417.92988: sending task result for task 0afff68d-5257-3482-6844-000000000156 24971 1727096417.93072: done sending task result for task 0afff68d-5257-3482-6844-000000000156 24971 1727096417.93075: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "state not in [\"present\", \"absent\"]", "skip_reason": "Conditional result was False" } 24971 1727096417.93145: no more pending results, returning what we have 24971 1727096417.93148: results queue empty 24971 1727096417.93149: checking for any_errors_fatal 24971 1727096417.93150: done checking for any_errors_fatal 24971 1727096417.93151: checking for max_fail_percentage 24971 1727096417.93152: done checking for max_fail_percentage 24971 1727096417.93152: checking to see if all hosts have failed and the running result is not ok 24971 1727096417.93153: done checking to see if all hosts have failed 24971 1727096417.93154: getting the remaining hosts for this loop 24971 1727096417.93155: done getting the remaining hosts for this loop 24971 1727096417.93158: getting the next task for host managed_node3 24971 1727096417.93163: done getting next task for host managed_node3 24971 1727096417.93165: ^ task is: TASK: Ensure type in ["dummy", "tap", "veth"] 24971 1727096417.93169: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24971 1727096417.93172: getting variables 24971 1727096417.93174: in VariableManager get_vars() 24971 1727096417.93204: Calling all_inventory to load vars for managed_node3 24971 1727096417.93206: Calling groups_inventory to load vars for managed_node3 24971 1727096417.93208: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096417.93217: Calling all_plugins_play to load vars for managed_node3 24971 1727096417.93220: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096417.93222: Calling groups_plugins_play to load vars for managed_node3 24971 1727096417.93357: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096417.93476: done with get_vars() 24971 1727096417.93483: done getting variables 24971 1727096417.93521: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Ensure type in ["dummy", "tap", "veth"]] ********************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:8 Monday 23 September 2024 09:00:17 -0400 (0:00:00.015) 0:00:05.413 ****** 24971 1727096417.93539: entering _queue_task() for managed_node3/fail 24971 1727096417.93730: worker is 1 (out of 1 available) 24971 1727096417.93742: exiting _queue_task() for managed_node3/fail 24971 1727096417.93754: done queuing things up, now waiting for results queue to drain 24971 1727096417.93755: waiting for pending results... 24971 1727096417.93899: running TaskExecutor() for managed_node3/TASK: Ensure type in ["dummy", "tap", "veth"] 24971 1727096417.93956: in run() - task 0afff68d-5257-3482-6844-000000000157 24971 1727096417.93969: variable 'ansible_search_path' from source: unknown 24971 1727096417.93973: variable 'ansible_search_path' from source: unknown 24971 1727096417.94003: calling self._execute() 24971 1727096417.94062: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096417.94066: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096417.94078: variable 'omit' from source: magic vars 24971 1727096417.94475: variable 'ansible_distribution_major_version' from source: facts 24971 1727096417.94479: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096417.94582: variable 'type' from source: play vars 24971 1727096417.94593: Evaluated conditional (type not in ["dummy", "tap", "veth"]): False 24971 1727096417.94600: when evaluation is False, skipping this task 24971 1727096417.94607: _execute() done 24971 1727096417.94612: dumping result to json 24971 1727096417.94618: done dumping result, returning 24971 1727096417.94627: done running TaskExecutor() for managed_node3/TASK: Ensure type in ["dummy", "tap", "veth"] [0afff68d-5257-3482-6844-000000000157] 24971 1727096417.94634: sending task result for task 0afff68d-5257-3482-6844-000000000157 skipping: [managed_node3] => { "changed": false, "false_condition": "type not in [\"dummy\", \"tap\", \"veth\"]", "skip_reason": "Conditional result was False" } 24971 1727096417.94810: no more pending results, returning what we have 24971 1727096417.94813: results queue empty 24971 1727096417.94814: checking for any_errors_fatal 24971 
1727096417.94818: done checking for any_errors_fatal 24971 1727096417.94819: checking for max_fail_percentage 24971 1727096417.94820: done checking for max_fail_percentage 24971 1727096417.94821: checking to see if all hosts have failed and the running result is not ok 24971 1727096417.94822: done checking to see if all hosts have failed 24971 1727096417.94822: getting the remaining hosts for this loop 24971 1727096417.94824: done getting the remaining hosts for this loop 24971 1727096417.94827: getting the next task for host managed_node3 24971 1727096417.94833: done getting next task for host managed_node3 24971 1727096417.94835: ^ task is: TASK: Include the task 'show_interfaces.yml' 24971 1727096417.94838: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24971 1727096417.94841: getting variables 24971 1727096417.94843: in VariableManager get_vars() 24971 1727096417.94880: Calling all_inventory to load vars for managed_node3 24971 1727096417.94883: Calling groups_inventory to load vars for managed_node3 24971 1727096417.94885: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096417.94894: Calling all_plugins_play to load vars for managed_node3 24971 1727096417.94896: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096417.94898: Calling groups_plugins_play to load vars for managed_node3 24971 1727096417.95086: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096417.95502: done with get_vars() 24971 1727096417.95509: done getting variables 24971 1727096417.95553: done sending task result for task 0afff68d-5257-3482-6844-000000000157 24971 1727096417.95556: WORKER PROCESS EXITING TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:13 Monday 23 September 2024 09:00:17 -0400 (0:00:00.020) 0:00:05.433 ****** 24971 1727096417.95588: entering _queue_task() for managed_node3/include_tasks 24971 1727096417.95775: worker is 1 (out of 1 available) 24971 1727096417.95787: exiting _queue_task() for managed_node3/include_tasks 24971 1727096417.95799: done queuing things up, now waiting for results queue to drain 24971 1727096417.95800: waiting for pending results... 
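The two skipped tasks above are input-validation guards at the top of manage_test_interface.yml (lines 3 and 8 per the task paths). Both load the fail action plugin, and the false_condition fields give their when expressions, so they are probably fail tasks along these lines (the msg text is illustrative and does not appear in the log):

- name: 'Ensure state in ["present", "absent"]'
  fail:
    msg: state must be one of 'present' or 'absent'  # assumed message text
  when: state not in ["present", "absent"]

- name: 'Ensure type in ["dummy", "tap", "veth"]'
  fail:
    msg: type must be one of 'dummy', 'tap' or 'veth'  # assumed message text
  when: type not in ["dummy", "tap", "veth"]

Since state and type hold valid values in this run, both conditions evaluate to False and the guards are skipped instead of aborting the play.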
24971 1727096417.95939: running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' 24971 1727096417.96013: in run() - task 0afff68d-5257-3482-6844-000000000158 24971 1727096417.96025: variable 'ansible_search_path' from source: unknown 24971 1727096417.96028: variable 'ansible_search_path' from source: unknown 24971 1727096417.96060: calling self._execute() 24971 1727096417.96121: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096417.96125: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096417.96135: variable 'omit' from source: magic vars 24971 1727096417.96405: variable 'ansible_distribution_major_version' from source: facts 24971 1727096417.96415: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096417.96421: _execute() done 24971 1727096417.96424: dumping result to json 24971 1727096417.96426: done dumping result, returning 24971 1727096417.96432: done running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' [0afff68d-5257-3482-6844-000000000158] 24971 1727096417.96437: sending task result for task 0afff68d-5257-3482-6844-000000000158 24971 1727096417.96520: done sending task result for task 0afff68d-5257-3482-6844-000000000158 24971 1727096417.96523: WORKER PROCESS EXITING 24971 1727096417.96550: no more pending results, returning what we have 24971 1727096417.96555: in VariableManager get_vars() 24971 1727096417.96598: Calling all_inventory to load vars for managed_node3 24971 1727096417.96600: Calling groups_inventory to load vars for managed_node3 24971 1727096417.96602: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096417.96611: Calling all_plugins_play to load vars for managed_node3 24971 1727096417.96614: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096417.96616: Calling groups_plugins_play to load vars for managed_node3 24971 1727096417.96740: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096417.96856: done with get_vars() 24971 1727096417.96861: variable 'ansible_search_path' from source: unknown 24971 1727096417.96862: variable 'ansible_search_path' from source: unknown 24971 1727096417.96892: we have included files to process 24971 1727096417.96893: generating all_blocks data 24971 1727096417.96894: done generating all_blocks data 24971 1727096417.96897: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 24971 1727096417.96897: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 24971 1727096417.96899: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 24971 1727096417.96963: in VariableManager get_vars() 24971 1727096417.96982: done with get_vars() 24971 1727096417.97056: done processing included file 24971 1727096417.97057: iterating over new_blocks loaded from include file 24971 1727096417.97058: in VariableManager get_vars() 24971 1727096417.97092: done with get_vars() 24971 1727096417.97094: filtering new block on tags 24971 1727096417.97111: done filtering new block on tags 24971 1727096417.97114: done iterating over new_blocks loaded from include file included: 
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node3 24971 1727096417.97117: extending task lists for all hosts with included blocks 24971 1727096417.97473: done extending task lists 24971 1727096417.97475: done processing included files 24971 1727096417.97476: results queue empty 24971 1727096417.97477: checking for any_errors_fatal 24971 1727096417.97479: done checking for any_errors_fatal 24971 1727096417.97480: checking for max_fail_percentage 24971 1727096417.97481: done checking for max_fail_percentage 24971 1727096417.97482: checking to see if all hosts have failed and the running result is not ok 24971 1727096417.97482: done checking to see if all hosts have failed 24971 1727096417.97483: getting the remaining hosts for this loop 24971 1727096417.97484: done getting the remaining hosts for this loop 24971 1727096417.97487: getting the next task for host managed_node3 24971 1727096417.97490: done getting next task for host managed_node3 24971 1727096417.97492: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 24971 1727096417.97495: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24971 1727096417.97497: getting variables 24971 1727096417.97498: in VariableManager get_vars() 24971 1727096417.97538: Calling all_inventory to load vars for managed_node3 24971 1727096417.97541: Calling groups_inventory to load vars for managed_node3 24971 1727096417.97544: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096417.97549: Calling all_plugins_play to load vars for managed_node3 24971 1727096417.97551: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096417.97554: Calling groups_plugins_play to load vars for managed_node3 24971 1727096417.97686: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096417.97862: done with get_vars() 24971 1727096417.97875: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Monday 23 September 2024 09:00:17 -0400 (0:00:00.023) 0:00:05.457 ****** 24971 1727096417.97938: entering _queue_task() for managed_node3/include_tasks 24971 1727096417.98271: worker is 1 (out of 1 available) 24971 1727096417.98281: exiting _queue_task() for managed_node3/include_tasks 24971 1727096417.98291: done queuing things up, now waiting for results queue to drain 24971 1727096417.98291: waiting for pending results... 
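show_interfaces.yml itself looks like a thin wrapper: its first task (show_interfaces.yml:3 above) includes get_current_interfaces.yml, and the 'Show current_interfaces' task seen elsewhere in this run prints the gathered list. A sketch under those assumptions (the exact debug message template is not visible in the log):

- name: Include the task 'get_current_interfaces.yml'
  include_tasks: get_current_interfaces.yml

- name: Show current_interfaces
  debug:
    msg: "current_interfaces: {{ current_interfaces }}"

The MSG line near the top of this excerpt, current_interfaces: ['bonding_masters', 'eth0', 'lo'], is consistent with a debug task of this form.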
24971 1727096417.98588: running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' 24971 1727096417.98594: in run() - task 0afff68d-5257-3482-6844-00000000017f 24971 1727096417.98598: variable 'ansible_search_path' from source: unknown 24971 1727096417.98601: variable 'ansible_search_path' from source: unknown 24971 1727096417.98686: calling self._execute() 24971 1727096417.98738: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096417.98750: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096417.98765: variable 'omit' from source: magic vars 24971 1727096417.99156: variable 'ansible_distribution_major_version' from source: facts 24971 1727096417.99181: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096417.99194: _execute() done 24971 1727096417.99229: dumping result to json 24971 1727096417.99232: done dumping result, returning 24971 1727096417.99235: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' [0afff68d-5257-3482-6844-00000000017f] 24971 1727096417.99238: sending task result for task 0afff68d-5257-3482-6844-00000000017f 24971 1727096417.99497: no more pending results, returning what we have 24971 1727096417.99502: in VariableManager get_vars() 24971 1727096417.99541: Calling all_inventory to load vars for managed_node3 24971 1727096417.99544: Calling groups_inventory to load vars for managed_node3 24971 1727096417.99547: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096417.99558: Calling all_plugins_play to load vars for managed_node3 24971 1727096417.99561: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096417.99565: Calling groups_plugins_play to load vars for managed_node3 24971 1727096417.99807: done sending task result for task 0afff68d-5257-3482-6844-00000000017f 24971 1727096417.99810: WORKER PROCESS EXITING 24971 1727096417.99831: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096418.00048: done with get_vars() 24971 1727096418.00055: variable 'ansible_search_path' from source: unknown 24971 1727096418.00057: variable 'ansible_search_path' from source: unknown 24971 1727096418.00113: we have included files to process 24971 1727096418.00115: generating all_blocks data 24971 1727096418.00116: done generating all_blocks data 24971 1727096418.00117: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 24971 1727096418.00118: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 24971 1727096418.00120: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 24971 1727096418.00366: done processing included file 24971 1727096418.00372: iterating over new_blocks loaded from include file 24971 1727096418.00374: in VariableManager get_vars() 24971 1727096418.00393: done with get_vars() 24971 1727096418.00395: filtering new block on tags 24971 1727096418.00412: done filtering new block on tags 24971 1727096418.00414: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for 
managed_node3 24971 1727096418.00418: extending task lists for all hosts with included blocks 24971 1727096418.00550: done extending task lists 24971 1727096418.00551: done processing included files 24971 1727096418.00551: results queue empty 24971 1727096418.00552: checking for any_errors_fatal 24971 1727096418.00554: done checking for any_errors_fatal 24971 1727096418.00555: checking for max_fail_percentage 24971 1727096418.00555: done checking for max_fail_percentage 24971 1727096418.00556: checking to see if all hosts have failed and the running result is not ok 24971 1727096418.00556: done checking to see if all hosts have failed 24971 1727096418.00557: getting the remaining hosts for this loop 24971 1727096418.00557: done getting the remaining hosts for this loop 24971 1727096418.00559: getting the next task for host managed_node3 24971 1727096418.00562: done getting next task for host managed_node3 24971 1727096418.00563: ^ task is: TASK: Gather current interface info 24971 1727096418.00565: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24971 1727096418.00566: getting variables 24971 1727096418.00571: in VariableManager get_vars() 24971 1727096418.00580: Calling all_inventory to load vars for managed_node3 24971 1727096418.00581: Calling groups_inventory to load vars for managed_node3 24971 1727096418.00583: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096418.00586: Calling all_plugins_play to load vars for managed_node3 24971 1727096418.00588: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096418.00591: Calling groups_plugins_play to load vars for managed_node3 24971 1727096418.00681: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096418.00795: done with get_vars() 24971 1727096418.00801: done getting variables 24971 1727096418.00829: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Monday 23 September 2024 09:00:18 -0400 (0:00:00.029) 0:00:05.486 ****** 24971 1727096418.00850: entering _queue_task() for managed_node3/command 24971 1727096418.01046: worker is 1 (out of 1 available) 24971 1727096418.01059: exiting _queue_task() for managed_node3/command 24971 1727096418.01074: done queuing things up, now waiting for results queue to drain 24971 1727096418.01075: waiting for pending results... 
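The 'Gather current interface info' task queued here (get_current_interfaces.yml:3) can be reconstructed from the module arguments reported further down (ls -1 with chdir=/sys/class/net) and from the _current_interfaces register variable that the following set_fact task reads; a sketch, not the literal file contents:

- name: Gather current interface info
  command:
    cmd: ls -1
    chdir: /sys/class/net
  register: _current_interfaces

On the managed host this simply lists /sys/class/net, so each returned entry is either a network interface name or the bonding_masters control file.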
24971 1727096418.01222: running TaskExecutor() for managed_node3/TASK: Gather current interface info 24971 1727096418.01297: in run() - task 0afff68d-5257-3482-6844-0000000001b6 24971 1727096418.01313: variable 'ansible_search_path' from source: unknown 24971 1727096418.01317: variable 'ansible_search_path' from source: unknown 24971 1727096418.01339: calling self._execute() 24971 1727096418.01400: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096418.01403: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096418.01416: variable 'omit' from source: magic vars 24971 1727096418.01722: variable 'ansible_distribution_major_version' from source: facts 24971 1727096418.01733: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096418.01737: variable 'omit' from source: magic vars 24971 1727096418.01778: variable 'omit' from source: magic vars 24971 1727096418.01801: variable 'omit' from source: magic vars 24971 1727096418.01832: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24971 1727096418.01862: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24971 1727096418.01879: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24971 1727096418.01893: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096418.01903: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096418.01925: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24971 1727096418.01928: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096418.01931: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096418.02002: Set connection var ansible_shell_type to sh 24971 1727096418.02008: Set connection var ansible_shell_executable to /bin/sh 24971 1727096418.02016: Set connection var ansible_timeout to 10 24971 1727096418.02021: Set connection var ansible_connection to ssh 24971 1727096418.02026: Set connection var ansible_pipelining to False 24971 1727096418.02031: Set connection var ansible_module_compression to ZIP_DEFLATED 24971 1727096418.02046: variable 'ansible_shell_executable' from source: unknown 24971 1727096418.02050: variable 'ansible_connection' from source: unknown 24971 1727096418.02052: variable 'ansible_module_compression' from source: unknown 24971 1727096418.02055: variable 'ansible_shell_type' from source: unknown 24971 1727096418.02057: variable 'ansible_shell_executable' from source: unknown 24971 1727096418.02059: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096418.02061: variable 'ansible_pipelining' from source: unknown 24971 1727096418.02065: variable 'ansible_timeout' from source: unknown 24971 1727096418.02081: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096418.02184: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24971 1727096418.02200: variable 'omit' from source: magic vars 24971 
1727096418.02203: starting attempt loop 24971 1727096418.02205: running the handler 24971 1727096418.02363: _low_level_execute_command(): starting 24971 1727096418.02366: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24971 1727096418.02992: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096418.03015: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096418.03031: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096418.03104: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096418.05309: stdout chunk (state=3): >>>/root <<< 24971 1727096418.05445: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096418.05476: stderr chunk (state=3): >>><<< 24971 1727096418.05480: stdout chunk (state=3): >>><<< 24971 1727096418.05506: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096418.05517: _low_level_execute_command(): starting 24971 1727096418.05522: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096418.055061-25233-248358587106324 `" && echo ansible-tmp-1727096418.055061-25233-248358587106324="` echo /root/.ansible/tmp/ansible-tmp-1727096418.055061-25233-248358587106324 `" ) && sleep 
0' 24971 1727096418.06041: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096418.06054: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096418.06125: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096418.08463: stdout chunk (state=3): >>>ansible-tmp-1727096418.055061-25233-248358587106324=/root/.ansible/tmp/ansible-tmp-1727096418.055061-25233-248358587106324 <<< 24971 1727096418.08610: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096418.08638: stderr chunk (state=3): >>><<< 24971 1727096418.08641: stdout chunk (state=3): >>><<< 24971 1727096418.08657: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096418.055061-25233-248358587106324=/root/.ansible/tmp/ansible-tmp-1727096418.055061-25233-248358587106324 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096418.08690: variable 'ansible_module_compression' from source: unknown 24971 1727096418.08734: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24971ctsu8jan/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 24971 1727096418.08764: variable 'ansible_facts' from source: unknown 24971 1727096418.08825: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096418.055061-25233-248358587106324/AnsiballZ_command.py 24971 1727096418.08934: Sending initial 
data 24971 1727096418.08938: Sent initial data (155 bytes) 24971 1727096418.09545: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096418.09605: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096418.09618: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096418.09659: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096418.09699: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096418.11691: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24971 1727096418.11720: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 24971 1727096418.11753: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24971ctsu8jan/tmpkgiqno92 /root/.ansible/tmp/ansible-tmp-1727096418.055061-25233-248358587106324/AnsiballZ_command.py <<< 24971 1727096418.11766: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096418.055061-25233-248358587106324/AnsiballZ_command.py" <<< 24971 1727096418.11791: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24971ctsu8jan/tmpkgiqno92" to remote "/root/.ansible/tmp/ansible-tmp-1727096418.055061-25233-248358587106324/AnsiballZ_command.py" <<< 24971 1727096418.11799: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096418.055061-25233-248358587106324/AnsiballZ_command.py" <<< 24971 1727096418.12285: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096418.12329: stderr chunk (state=3): >>><<< 24971 1727096418.12332: stdout chunk (state=3): >>><<< 24971 1727096418.12376: done transferring module to remote 24971 1727096418.12384: _low_level_execute_command(): starting 24971 1727096418.12389: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096418.055061-25233-248358587106324/ /root/.ansible/tmp/ansible-tmp-1727096418.055061-25233-248358587106324/AnsiballZ_command.py && sleep 0' 24971 1727096418.12872: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096418.12906: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24971 1727096418.12925: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 24971 1727096418.12937: stderr chunk (state=3): >>>debug2: match not found <<< 24971 1727096418.12954: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096418.12992: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24971 1727096418.13083: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096418.13117: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096418.13174: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096418.15654: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096418.15688: stderr chunk (state=3): >>><<< 24971 1727096418.15691: stdout chunk (state=3): >>><<< 24971 1727096418.15705: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096418.15708: _low_level_execute_command(): starting 24971 1727096418.15714: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096418.055061-25233-248358587106324/AnsiballZ_command.py && sleep 0' 24971 1727096418.16154: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096418.16157: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 24971 1727096418.16160: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096418.16162: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096418.16209: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096418.16212: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096418.16261: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096418.39860: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-23 09:00:18.389965", "end": "2024-09-23 09:00:18.394540", "delta": "0:00:00.004575", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 24971 1727096418.41859: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
<<< 24971 1727096418.41863: stdout chunk (state=3): >>><<< 24971 1727096418.41866: stderr chunk (state=3): >>><<< 24971 1727096418.41873: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-23 09:00:18.389965", "end": "2024-09-23 09:00:18.394540", "delta": "0:00:00.004575", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
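Unpacked from the single-line JSON above, the registered result is roughly the following (only fields present in the log are shown; stdout_lines, which Ansible derives from stdout, is omitted):

_current_interfaces:
  changed: true
  cmd: ["ls", "-1"]
  rc: 0
  stdout: "bonding_masters\neth0\nlo"
  stderr: ""
  start: "2024-09-23 09:00:18.389965"
  end: "2024-09-23 09:00:18.394540"
  delta: "0:00:00.004575"

Note that the raw module result says changed: true while the task result printed below reports changed: false, which points to a changed_when: false override in the task file (an assumption; the override itself is not visible in this log).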
24971 1727096418.41917: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096418.055061-25233-248358587106324/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24971 1727096418.42077: _low_level_execute_command(): starting 24971 1727096418.42081: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096418.055061-25233-248358587106324/ > /dev/null 2>&1 && sleep 0' 24971 1727096418.43078: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096418.43105: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24971 1727096418.43118: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24971 1727096418.43161: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096418.43203: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096418.43206: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096418.43251: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096418.45254: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096418.45258: stdout chunk (state=3): >>><<< 24971 1727096418.45260: stderr chunk (state=3): >>><<< 24971 1727096418.45279: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096418.45473: handler run complete 24971 1727096418.45477: Evaluated conditional (False): False 24971 1727096418.45479: attempt loop complete, returning result 24971 1727096418.45481: _execute() done 24971 1727096418.45483: dumping result to json 24971 1727096418.45485: done dumping result, returning 24971 1727096418.45487: done running TaskExecutor() for managed_node3/TASK: Gather current interface info [0afff68d-5257-3482-6844-0000000001b6] 24971 1727096418.45489: sending task result for task 0afff68d-5257-3482-6844-0000000001b6 24971 1727096418.45560: done sending task result for task 0afff68d-5257-3482-6844-0000000001b6 24971 1727096418.45563: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.004575", "end": "2024-09-23 09:00:18.394540", "rc": 0, "start": "2024-09-23 09:00:18.389965" } STDOUT: bonding_masters eth0 lo 24971 1727096418.45649: no more pending results, returning what we have 24971 1727096418.45652: results queue empty 24971 1727096418.45653: checking for any_errors_fatal 24971 1727096418.45655: done checking for any_errors_fatal 24971 1727096418.45655: checking for max_fail_percentage 24971 1727096418.45657: done checking for max_fail_percentage 24971 1727096418.45658: checking to see if all hosts have failed and the running result is not ok 24971 1727096418.45659: done checking to see if all hosts have failed 24971 1727096418.45659: getting the remaining hosts for this loop 24971 1727096418.45661: done getting the remaining hosts for this loop 24971 1727096418.45664: getting the next task for host managed_node3 24971 1727096418.45679: done getting next task for host managed_node3 24971 1727096418.45682: ^ task is: TASK: Set current_interfaces 24971 1727096418.45687: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24971 1727096418.45691: getting variables 24971 1727096418.45693: in VariableManager get_vars() 24971 1727096418.45908: Calling all_inventory to load vars for managed_node3 24971 1727096418.45911: Calling groups_inventory to load vars for managed_node3 24971 1727096418.45914: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096418.45926: Calling all_plugins_play to load vars for managed_node3 24971 1727096418.45929: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096418.45933: Calling groups_plugins_play to load vars for managed_node3 24971 1727096418.46237: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096418.46462: done with get_vars() 24971 1727096418.46477: done getting variables 24971 1727096418.46607: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Monday 23 September 2024 09:00:18 -0400 (0:00:00.457) 0:00:05.944 ****** 24971 1727096418.46637: entering _queue_task() for managed_node3/set_fact 24971 1727096418.47277: worker is 1 (out of 1 available) 24971 1727096418.47291: exiting _queue_task() for managed_node3/set_fact 24971 1727096418.47317: done queuing things up, now waiting for results queue to drain 24971 1727096418.47319: waiting for pending results... 
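The 'Set current_interfaces' task queued here (get_current_interfaces.yml:9) turns that registered output into the current_interfaces fact shown in the result that follows; a plausible sketch, assuming it simply takes stdout_lines (the exact expression is not in the log):

- name: Set current_interfaces
  set_fact:
    current_interfaces: "{{ _current_interfaces.stdout_lines }}"

With the command output above this yields ['bonding_masters', 'eth0', 'lo'], matching the ansible_facts value reported below.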
24971 1727096418.47974: running TaskExecutor() for managed_node3/TASK: Set current_interfaces 24971 1727096418.47979: in run() - task 0afff68d-5257-3482-6844-0000000001b7 24971 1727096418.47983: variable 'ansible_search_path' from source: unknown 24971 1727096418.47985: variable 'ansible_search_path' from source: unknown 24971 1727096418.47988: calling self._execute() 24971 1727096418.48124: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096418.48159: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096418.48177: variable 'omit' from source: magic vars 24971 1727096418.48541: variable 'ansible_distribution_major_version' from source: facts 24971 1727096418.48553: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096418.48560: variable 'omit' from source: magic vars 24971 1727096418.48618: variable 'omit' from source: magic vars 24971 1727096418.48722: variable '_current_interfaces' from source: set_fact 24971 1727096418.48782: variable 'omit' from source: magic vars 24971 1727096418.48830: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24971 1727096418.48876: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24971 1727096418.48879: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24971 1727096418.48894: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096418.48905: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096418.48938: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24971 1727096418.48941: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096418.48944: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096418.49034: Set connection var ansible_shell_type to sh 24971 1727096418.49045: Set connection var ansible_shell_executable to /bin/sh 24971 1727096418.49052: Set connection var ansible_timeout to 10 24971 1727096418.49058: Set connection var ansible_connection to ssh 24971 1727096418.49063: Set connection var ansible_pipelining to False 24971 1727096418.49070: Set connection var ansible_module_compression to ZIP_DEFLATED 24971 1727096418.49094: variable 'ansible_shell_executable' from source: unknown 24971 1727096418.49098: variable 'ansible_connection' from source: unknown 24971 1727096418.49100: variable 'ansible_module_compression' from source: unknown 24971 1727096418.49103: variable 'ansible_shell_type' from source: unknown 24971 1727096418.49105: variable 'ansible_shell_executable' from source: unknown 24971 1727096418.49107: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096418.49112: variable 'ansible_pipelining' from source: unknown 24971 1727096418.49114: variable 'ansible_timeout' from source: unknown 24971 1727096418.49118: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096418.49249: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, 
class_only=False) 24971 1727096418.49306: variable 'omit' from source: magic vars 24971 1727096418.49313: starting attempt loop 24971 1727096418.49316: running the handler 24971 1727096418.49318: handler run complete 24971 1727096418.49321: attempt loop complete, returning result 24971 1727096418.49323: _execute() done 24971 1727096418.49325: dumping result to json 24971 1727096418.49328: done dumping result, returning 24971 1727096418.49330: done running TaskExecutor() for managed_node3/TASK: Set current_interfaces [0afff68d-5257-3482-6844-0000000001b7] 24971 1727096418.49332: sending task result for task 0afff68d-5257-3482-6844-0000000001b7 ok: [managed_node3] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 24971 1727096418.49530: no more pending results, returning what we have 24971 1727096418.49532: results queue empty 24971 1727096418.49533: checking for any_errors_fatal 24971 1727096418.49539: done checking for any_errors_fatal 24971 1727096418.49540: checking for max_fail_percentage 24971 1727096418.49541: done checking for max_fail_percentage 24971 1727096418.49542: checking to see if all hosts have failed and the running result is not ok 24971 1727096418.49542: done checking to see if all hosts have failed 24971 1727096418.49543: getting the remaining hosts for this loop 24971 1727096418.49544: done getting the remaining hosts for this loop 24971 1727096418.49547: getting the next task for host managed_node3 24971 1727096418.49554: done getting next task for host managed_node3 24971 1727096418.49556: ^ task is: TASK: Show current_interfaces 24971 1727096418.49560: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24971 1727096418.49563: getting variables 24971 1727096418.49564: in VariableManager get_vars() 24971 1727096418.49600: Calling all_inventory to load vars for managed_node3 24971 1727096418.49603: Calling groups_inventory to load vars for managed_node3 24971 1727096418.49605: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096418.49614: Calling all_plugins_play to load vars for managed_node3 24971 1727096418.49617: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096418.49620: Calling groups_plugins_play to load vars for managed_node3 24971 1727096418.49787: done sending task result for task 0afff68d-5257-3482-6844-0000000001b7 24971 1727096418.49790: WORKER PROCESS EXITING 24971 1727096418.49811: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096418.50089: done with get_vars() 24971 1727096418.50099: done getting variables 24971 1727096418.50160: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Monday 23 September 2024 09:00:18 -0400 (0:00:00.035) 0:00:05.979 ****** 24971 1727096418.50192: entering _queue_task() for managed_node3/debug 24971 1727096418.50579: worker is 1 (out of 1 available) 24971 1727096418.50588: exiting _queue_task() for managed_node3/debug 24971 1727096418.50599: done queuing things up, now waiting for results queue to drain 24971 1727096418.50600: waiting for pending results... 
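The "Set current_interfaces" task whose result appears above is, judging from the resulting fact, essentially a set_fact over the registered listing. A sketch under that assumption; the exact expression (e.g. stdout_lines) is not shown in the log, only the resulting list.

  - name: Set current_interfaces
    set_fact:
      current_interfaces: "{{ _current_interfaces.stdout_lines }}"   # assumed expression; the log only shows the resulting list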
24971 1727096418.50765: running TaskExecutor() for managed_node3/TASK: Show current_interfaces 24971 1727096418.50865: in run() - task 0afff68d-5257-3482-6844-000000000180 24971 1727096418.50983: variable 'ansible_search_path' from source: unknown 24971 1727096418.50987: variable 'ansible_search_path' from source: unknown 24971 1727096418.50990: calling self._execute() 24971 1727096418.51015: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096418.51025: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096418.51038: variable 'omit' from source: magic vars 24971 1727096418.51416: variable 'ansible_distribution_major_version' from source: facts 24971 1727096418.51438: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096418.51450: variable 'omit' from source: magic vars 24971 1727096418.51499: variable 'omit' from source: magic vars 24971 1727096418.51604: variable 'current_interfaces' from source: set_fact 24971 1727096418.51645: variable 'omit' from source: magic vars 24971 1727096418.51687: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24971 1727096418.51731: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24971 1727096418.51853: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24971 1727096418.51858: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096418.51861: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096418.51863: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24971 1727096418.51865: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096418.51869: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096418.51940: Set connection var ansible_shell_type to sh 24971 1727096418.51953: Set connection var ansible_shell_executable to /bin/sh 24971 1727096418.51974: Set connection var ansible_timeout to 10 24971 1727096418.52073: Set connection var ansible_connection to ssh 24971 1727096418.52076: Set connection var ansible_pipelining to False 24971 1727096418.52078: Set connection var ansible_module_compression to ZIP_DEFLATED 24971 1727096418.52080: variable 'ansible_shell_executable' from source: unknown 24971 1727096418.52083: variable 'ansible_connection' from source: unknown 24971 1727096418.52084: variable 'ansible_module_compression' from source: unknown 24971 1727096418.52086: variable 'ansible_shell_type' from source: unknown 24971 1727096418.52090: variable 'ansible_shell_executable' from source: unknown 24971 1727096418.52091: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096418.52093: variable 'ansible_pipelining' from source: unknown 24971 1727096418.52095: variable 'ansible_timeout' from source: unknown 24971 1727096418.52097: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096418.52213: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 
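The "Show current_interfaces" task being run here loads the debug action and prints the fact set by the previous task. A minimal sketch, with the msg wording inferred from the MSG line that follows:

  - name: Show current_interfaces
    debug:
      msg: "current_interfaces: {{ current_interfaces }}"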
24971 1727096418.52274: variable 'omit' from source: magic vars 24971 1727096418.52277: starting attempt loop 24971 1727096418.52280: running the handler 24971 1727096418.52293: handler run complete 24971 1727096418.52311: attempt loop complete, returning result 24971 1727096418.52321: _execute() done 24971 1727096418.52327: dumping result to json 24971 1727096418.52334: done dumping result, returning 24971 1727096418.52345: done running TaskExecutor() for managed_node3/TASK: Show current_interfaces [0afff68d-5257-3482-6844-000000000180] 24971 1727096418.52352: sending task result for task 0afff68d-5257-3482-6844-000000000180 ok: [managed_node3] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 24971 1727096418.52549: no more pending results, returning what we have 24971 1727096418.52553: results queue empty 24971 1727096418.52554: checking for any_errors_fatal 24971 1727096418.52560: done checking for any_errors_fatal 24971 1727096418.52561: checking for max_fail_percentage 24971 1727096418.52563: done checking for max_fail_percentage 24971 1727096418.52563: checking to see if all hosts have failed and the running result is not ok 24971 1727096418.52564: done checking to see if all hosts have failed 24971 1727096418.52565: getting the remaining hosts for this loop 24971 1727096418.52566: done getting the remaining hosts for this loop 24971 1727096418.52574: getting the next task for host managed_node3 24971 1727096418.52583: done getting next task for host managed_node3 24971 1727096418.52586: ^ task is: TASK: Install iproute 24971 1727096418.52589: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24971 1727096418.52594: getting variables 24971 1727096418.52596: in VariableManager get_vars() 24971 1727096418.52704: Calling all_inventory to load vars for managed_node3 24971 1727096418.52707: Calling groups_inventory to load vars for managed_node3 24971 1727096418.52709: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096418.52878: Calling all_plugins_play to load vars for managed_node3 24971 1727096418.52882: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096418.52885: Calling groups_plugins_play to load vars for managed_node3 24971 1727096418.53064: done sending task result for task 0afff68d-5257-3482-6844-000000000180 24971 1727096418.53069: WORKER PROCESS EXITING 24971 1727096418.53097: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096418.53308: done with get_vars() 24971 1727096418.53319: done getting variables 24971 1727096418.53376: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Install iproute] ********************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 Monday 23 September 2024 09:00:18 -0400 (0:00:00.032) 0:00:06.011 ****** 24971 1727096418.53412: entering _queue_task() for managed_node3/package 24971 1727096418.53681: worker is 1 (out of 1 available) 24971 1727096418.53692: exiting _queue_task() for managed_node3/package 24971 1727096418.53704: done queuing things up, now waiting for results queue to drain 24971 1727096418.53705: waiting for pending results... 
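The "Install iproute" task queued here resolves to the package action (dnf on this host). Based on the module arguments, the "attempts" counter, and the "__install_status is success" conditional seen below, it is roughly the following; the retry and delay values are assumptions, not taken from the task file.

  - name: Install iproute
    package:
      name: iproute
      state: present
    register: __install_status
    until: __install_status is success
    retries: 5    # assumed value
    delay: 10     # assumed value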
24971 1727096418.53949: running TaskExecutor() for managed_node3/TASK: Install iproute 24971 1727096418.54042: in run() - task 0afff68d-5257-3482-6844-000000000159 24971 1727096418.54070: variable 'ansible_search_path' from source: unknown 24971 1727096418.54078: variable 'ansible_search_path' from source: unknown 24971 1727096418.54118: calling self._execute() 24971 1727096418.54208: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096418.54218: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096418.54232: variable 'omit' from source: magic vars 24971 1727096418.55050: variable 'ansible_distribution_major_version' from source: facts 24971 1727096418.55054: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096418.55056: variable 'omit' from source: magic vars 24971 1727096418.55058: variable 'omit' from source: magic vars 24971 1727096418.55359: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24971 1727096418.60064: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24971 1727096418.60206: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24971 1727096418.60249: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24971 1727096418.60287: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24971 1727096418.60437: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24971 1727096418.60606: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24971 1727096418.60640: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24971 1727096418.60681: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24971 1727096418.60721: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24971 1727096418.60744: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24971 1727096418.60872: variable '__network_is_ostree' from source: set_fact 24971 1727096418.60877: variable 'omit' from source: magic vars 24971 1727096418.60993: variable 'omit' from source: magic vars 24971 1727096418.60996: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24971 1727096418.60999: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24971 1727096418.61001: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24971 1727096418.61017: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 24971 1727096418.61035: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096418.61071: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24971 1727096418.61081: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096418.61089: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096418.61189: Set connection var ansible_shell_type to sh 24971 1727096418.61209: Set connection var ansible_shell_executable to /bin/sh 24971 1727096418.61225: Set connection var ansible_timeout to 10 24971 1727096418.61235: Set connection var ansible_connection to ssh 24971 1727096418.61244: Set connection var ansible_pipelining to False 24971 1727096418.61252: Set connection var ansible_module_compression to ZIP_DEFLATED 24971 1727096418.61281: variable 'ansible_shell_executable' from source: unknown 24971 1727096418.61289: variable 'ansible_connection' from source: unknown 24971 1727096418.61295: variable 'ansible_module_compression' from source: unknown 24971 1727096418.61302: variable 'ansible_shell_type' from source: unknown 24971 1727096418.61317: variable 'ansible_shell_executable' from source: unknown 24971 1727096418.61320: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096418.61427: variable 'ansible_pipelining' from source: unknown 24971 1727096418.61430: variable 'ansible_timeout' from source: unknown 24971 1727096418.61432: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096418.61441: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24971 1727096418.61456: variable 'omit' from source: magic vars 24971 1727096418.61466: starting attempt loop 24971 1727096418.61475: running the handler 24971 1727096418.61487: variable 'ansible_facts' from source: unknown 24971 1727096418.61492: variable 'ansible_facts' from source: unknown 24971 1727096418.61528: _low_level_execute_command(): starting 24971 1727096418.61646: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24971 1727096418.62230: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24971 1727096418.62246: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096418.62293: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096418.62320: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24971 1727096418.62450: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096418.62587: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096418.62689: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096418.62757: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096418.64412: stdout chunk (state=3): >>>/root <<< 24971 1727096418.64509: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096418.64544: stderr chunk (state=3): >>><<< 24971 1727096418.64547: stdout chunk (state=3): >>><<< 24971 1727096418.64561: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096418.64601: _low_level_execute_command(): starting 24971 1727096418.64605: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096418.6457543-25252-166624733922965 `" && echo ansible-tmp-1727096418.6457543-25252-166624733922965="` echo /root/.ansible/tmp/ansible-tmp-1727096418.6457543-25252-166624733922965 `" ) && sleep 0' 24971 1727096418.65026: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096418.65030: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096418.65033: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096418.65035: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 24971 1727096418.65037: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096418.65085: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096418.65089: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096418.65093: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096418.65132: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096418.67067: stdout chunk (state=3): >>>ansible-tmp-1727096418.6457543-25252-166624733922965=/root/.ansible/tmp/ansible-tmp-1727096418.6457543-25252-166624733922965 <<< 24971 1727096418.67180: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096418.67210: stderr chunk (state=3): >>><<< 24971 1727096418.67213: stdout chunk (state=3): >>><<< 24971 1727096418.67224: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096418.6457543-25252-166624733922965=/root/.ansible/tmp/ansible-tmp-1727096418.6457543-25252-166624733922965 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096418.67250: variable 'ansible_module_compression' from source: unknown 24971 1727096418.67306: ANSIBALLZ: Using generic lock for ansible.legacy.dnf 24971 1727096418.67310: ANSIBALLZ: Acquiring lock 24971 1727096418.67314: ANSIBALLZ: Lock acquired: 139839577444416 24971 1727096418.67316: ANSIBALLZ: Creating module 24971 1727096418.80118: ANSIBALLZ: Writing module into payload 24971 1727096418.80254: ANSIBALLZ: Writing module 24971 1727096418.80276: ANSIBALLZ: Renaming module 24971 1727096418.80291: ANSIBALLZ: Done creating module 24971 1727096418.80303: variable 'ansible_facts' from source: unknown 24971 1727096418.80357: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096418.6457543-25252-166624733922965/AnsiballZ_dnf.py 24971 1727096418.80458: Sending initial data 24971 1727096418.80461: Sent initial data (152 bytes) 24971 1727096418.80920: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096418.80923: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not 
found <<< 24971 1727096418.80925: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096418.80928: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096418.80930: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096418.80985: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096418.80988: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096418.80990: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096418.81034: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096418.82702: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24971 1727096418.82729: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 24971 1727096418.82770: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24971ctsu8jan/tmp1tgzp1b0 /root/.ansible/tmp/ansible-tmp-1727096418.6457543-25252-166624733922965/AnsiballZ_dnf.py <<< 24971 1727096418.82778: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096418.6457543-25252-166624733922965/AnsiballZ_dnf.py" <<< 24971 1727096418.82799: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24971ctsu8jan/tmp1tgzp1b0" to remote "/root/.ansible/tmp/ansible-tmp-1727096418.6457543-25252-166624733922965/AnsiballZ_dnf.py" <<< 24971 1727096418.82802: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096418.6457543-25252-166624733922965/AnsiballZ_dnf.py" <<< 24971 1727096418.83405: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096418.83453: stderr chunk (state=3): >>><<< 24971 1727096418.83456: stdout chunk (state=3): >>><<< 24971 1727096418.83501: done transferring module to remote 24971 1727096418.83510: _low_level_execute_command(): starting 24971 1727096418.83515: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096418.6457543-25252-166624733922965/ /root/.ansible/tmp/ansible-tmp-1727096418.6457543-25252-166624733922965/AnsiballZ_dnf.py && sleep 0' 24971 1727096418.83972: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096418.83976: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 24971 1727096418.83978: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration <<< 24971 1727096418.83980: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24971 1727096418.84011: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096418.84049: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096418.84055: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096418.84060: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096418.84092: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096418.85976: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096418.85980: stdout chunk (state=3): >>><<< 24971 1727096418.85983: stderr chunk (state=3): >>><<< 24971 1727096418.86078: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096418.86082: _low_level_execute_command(): starting 24971 1727096418.86085: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096418.6457543-25252-166624733922965/AnsiballZ_dnf.py && sleep 0' 24971 1727096418.86789: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096418.86814: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096418.86834: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096418.86909: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096419.28710: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 24971 1727096419.32979: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 
closed. <<< 24971 1727096419.32998: stdout chunk (state=3): >>><<< 24971 1727096419.33011: stderr chunk (state=3): >>><<< 24971 1727096419.33033: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
24971 1727096419.33091: done with _execute_module (ansible.legacy.dnf, {'name': 'iproute', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096418.6457543-25252-166624733922965/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24971 1727096419.33114: _low_level_execute_command(): starting 24971 1727096419.33173: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096418.6457543-25252-166624733922965/ > /dev/null 2>&1 && sleep 0' 24971 1727096419.33770: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096419.33816: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096419.33856: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096419.33888: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096419.35820: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096419.35833: stdout chunk (state=3): >>><<< 24971 1727096419.35852: stderr chunk (state=3): >>><<< 24971 1727096419.36074: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096419.36077: handler run complete 24971 1727096419.36080: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24971 1727096419.36238: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24971 1727096419.36285: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24971 1727096419.36322: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24971 1727096419.36357: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24971 1727096419.36431: variable '__install_status' from source: unknown 24971 1727096419.36457: Evaluated conditional (__install_status is success): True 24971 1727096419.36483: attempt loop complete, returning result 24971 1727096419.36491: _execute() done 24971 1727096419.36497: dumping result to json 24971 1727096419.36508: done dumping result, returning 24971 1727096419.36520: done running TaskExecutor() for managed_node3/TASK: Install iproute [0afff68d-5257-3482-6844-000000000159] 24971 1727096419.36528: sending task result for task 0afff68d-5257-3482-6844-000000000159 ok: [managed_node3] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 24971 1727096419.36736: no more pending results, returning what we have 24971 1727096419.36739: results queue empty 24971 1727096419.36740: checking for any_errors_fatal 24971 1727096419.36746: done checking for any_errors_fatal 24971 1727096419.36747: checking for max_fail_percentage 24971 1727096419.36748: done checking for max_fail_percentage 24971 1727096419.36749: checking to see if all hosts have failed and the running result is not ok 24971 1727096419.36750: done checking to see if all hosts have failed 24971 1727096419.36751: getting the remaining hosts for this loop 24971 1727096419.36752: done getting the remaining hosts for this loop 24971 1727096419.36756: getting the next task for host managed_node3 24971 1727096419.36763: done getting next task for host managed_node3 24971 1727096419.36765: ^ task is: TASK: Create veth interface {{ interface }} 24971 1727096419.36770: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24971 1727096419.36774: getting variables 24971 1727096419.36776: in VariableManager get_vars() 24971 1727096419.36817: Calling all_inventory to load vars for managed_node3 24971 1727096419.36819: Calling groups_inventory to load vars for managed_node3 24971 1727096419.36822: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096419.36834: Calling all_plugins_play to load vars for managed_node3 24971 1727096419.36837: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096419.36840: Calling groups_plugins_play to load vars for managed_node3 24971 1727096419.37392: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096419.37623: done with get_vars() 24971 1727096419.37634: done getting variables 24971 1727096419.37663: done sending task result for task 0afff68d-5257-3482-6844-000000000159 24971 1727096419.37666: WORKER PROCESS EXITING 24971 1727096419.37700: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 24971 1727096419.37815: variable 'interface' from source: play vars TASK [Create veth interface veth0] ********************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 Monday 23 September 2024 09:00:19 -0400 (0:00:00.844) 0:00:06.856 ****** 24971 1727096419.37861: entering _queue_task() for managed_node3/command 24971 1727096419.38141: worker is 1 (out of 1 available) 24971 1727096419.38153: exiting _queue_task() for managed_node3/command 24971 1727096419.38166: done queuing things up, now waiting for results queue to drain 24971 1727096419.38168: waiting for pending results... 
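The "Create veth interface veth0" task starting here uses the command module with an items loop (the 'items' lookup, the 'item' variable, and the "type == 'veth' and state == 'present' and interface not in current_interfaces" conditional appear below). A sketch of what such a task typically looks like; the specific ip commands in the loop are assumptions, only the when condition is taken from this log.

  - name: Create veth interface {{ interface }}
    command: "{{ item }}"
    with_items:
      - ip link add {{ interface }} type veth peer name peer{{ interface }}   # assumed command
      - ip link set peer{{ interface }} up                                    # assumed command
      - ip link set {{ interface }} up                                        # assumed command
    when: type == 'veth' and state == 'present' and interface not in current_interfaces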
24971 1727096419.38431: running TaskExecutor() for managed_node3/TASK: Create veth interface veth0 24971 1727096419.38542: in run() - task 0afff68d-5257-3482-6844-00000000015a 24971 1727096419.38560: variable 'ansible_search_path' from source: unknown 24971 1727096419.38570: variable 'ansible_search_path' from source: unknown 24971 1727096419.38845: variable 'interface' from source: play vars 24971 1727096419.38935: variable 'interface' from source: play vars 24971 1727096419.39012: variable 'interface' from source: play vars 24971 1727096419.39151: Loaded config def from plugin (lookup/items) 24971 1727096419.39162: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py 24971 1727096419.39191: variable 'omit' from source: magic vars 24971 1727096419.39303: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096419.39315: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096419.39329: variable 'omit' from source: magic vars 24971 1727096419.39536: variable 'ansible_distribution_major_version' from source: facts 24971 1727096419.39548: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096419.39737: variable 'type' from source: play vars 24971 1727096419.39749: variable 'state' from source: include params 24971 1727096419.39759: variable 'interface' from source: play vars 24971 1727096419.39772: variable 'current_interfaces' from source: set_fact 24971 1727096419.39785: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 24971 1727096419.39796: variable 'omit' from source: magic vars 24971 1727096419.39849: variable 'omit' from source: magic vars 24971 1727096419.39902: variable 'item' from source: unknown 24971 1727096419.39982: variable 'item' from source: unknown 24971 1727096419.40004: variable 'omit' from source: magic vars 24971 1727096419.40037: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24971 1727096419.40076: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24971 1727096419.40098: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24971 1727096419.40119: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096419.40135: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096419.40174: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24971 1727096419.40183: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096419.40190: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096419.40284: Set connection var ansible_shell_type to sh 24971 1727096419.40296: Set connection var ansible_shell_executable to /bin/sh 24971 1727096419.40310: Set connection var ansible_timeout to 10 24971 1727096419.40319: Set connection var ansible_connection to ssh 24971 1727096419.40328: Set connection var ansible_pipelining to False 24971 1727096419.40336: Set connection var ansible_module_compression to ZIP_DEFLATED 24971 1727096419.40356: variable 'ansible_shell_executable' from source: unknown 24971 1727096419.40363: variable 'ansible_connection' from source: unknown 24971 1727096419.40375: 
variable 'ansible_module_compression' from source: unknown 24971 1727096419.40381: variable 'ansible_shell_type' from source: unknown 24971 1727096419.40387: variable 'ansible_shell_executable' from source: unknown 24971 1727096419.40392: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096419.40398: variable 'ansible_pipelining' from source: unknown 24971 1727096419.40404: variable 'ansible_timeout' from source: unknown 24971 1727096419.40410: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096419.40535: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24971 1727096419.40551: variable 'omit' from source: magic vars 24971 1727096419.40559: starting attempt loop 24971 1727096419.40564: running the handler 24971 1727096419.40589: _low_level_execute_command(): starting 24971 1727096419.40721: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24971 1727096419.41374: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24971 1727096419.41382: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096419.41405: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096419.41423: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096419.41493: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096419.43160: stdout chunk (state=3): >>>/root <<< 24971 1727096419.43293: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096419.43307: stdout chunk (state=3): >>><<< 24971 1727096419.43321: stderr chunk (state=3): >>><<< 24971 1727096419.43347: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096419.43364: _low_level_execute_command(): starting 24971 1727096419.43452: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096419.4335322-25289-55248454351692 `" && echo ansible-tmp-1727096419.4335322-25289-55248454351692="` echo /root/.ansible/tmp/ansible-tmp-1727096419.4335322-25289-55248454351692 `" ) && sleep 0' 24971 1727096419.44027: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24971 1727096419.44040: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096419.44054: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096419.44075: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24971 1727096419.44092: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 24971 1727096419.44115: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096419.44218: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096419.44242: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096419.44304: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096419.46292: stdout chunk (state=3): >>>ansible-tmp-1727096419.4335322-25289-55248454351692=/root/.ansible/tmp/ansible-tmp-1727096419.4335322-25289-55248454351692 <<< 24971 1727096419.46377: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096419.46400: stdout chunk (state=3): >>><<< 24971 1727096419.46403: stderr chunk (state=3): >>><<< 24971 1727096419.46419: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096419.4335322-25289-55248454351692=/root/.ansible/tmp/ansible-tmp-1727096419.4335322-25289-55248454351692 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match 
not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096419.46573: variable 'ansible_module_compression' from source: unknown 24971 1727096419.46576: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24971ctsu8jan/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 24971 1727096419.46578: variable 'ansible_facts' from source: unknown 24971 1727096419.46640: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096419.4335322-25289-55248454351692/AnsiballZ_command.py 24971 1727096419.46873: Sending initial data 24971 1727096419.46883: Sent initial data (155 bytes) 24971 1727096419.47390: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24971 1727096419.47404: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096419.47419: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096419.47483: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096419.47541: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096419.47559: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096419.47584: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096419.47645: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096419.49230: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 24971 1727096419.49252: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension 
"limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24971 1727096419.49305: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 24971 1727096419.49365: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24971ctsu8jan/tmpnmewtkn5 /root/.ansible/tmp/ansible-tmp-1727096419.4335322-25289-55248454351692/AnsiballZ_command.py <<< 24971 1727096419.49399: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096419.4335322-25289-55248454351692/AnsiballZ_command.py" <<< 24971 1727096419.49417: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24971ctsu8jan/tmpnmewtkn5" to remote "/root/.ansible/tmp/ansible-tmp-1727096419.4335322-25289-55248454351692/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096419.4335322-25289-55248454351692/AnsiballZ_command.py" <<< 24971 1727096419.50087: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096419.50234: stderr chunk (state=3): >>><<< 24971 1727096419.50238: stdout chunk (state=3): >>><<< 24971 1727096419.50249: done transferring module to remote 24971 1727096419.50264: _low_level_execute_command(): starting 24971 1727096419.50277: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096419.4335322-25289-55248454351692/ /root/.ansible/tmp/ansible-tmp-1727096419.4335322-25289-55248454351692/AnsiballZ_command.py && sleep 0' 24971 1727096419.50998: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096419.51050: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096419.51071: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096419.51101: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096419.51170: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096419.52987: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096419.53010: stderr chunk (state=3): >>><<< 24971 1727096419.53013: stdout chunk (state=3): >>><<< 24971 1727096419.53031: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096419.53034: _low_level_execute_command(): starting 24971 1727096419.53040: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096419.4335322-25289-55248454351692/AnsiballZ_command.py && sleep 0' 24971 1727096419.53654: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24971 1727096419.53657: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096419.53659: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096419.53666: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24971 1727096419.53685: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 24971 1727096419.53693: stderr chunk (state=3): >>>debug2: match not found <<< 24971 1727096419.53703: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096419.53717: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24971 1727096419.53725: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address <<< 24971 1727096419.53731: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24971 1727096419.53739: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096419.53749: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096419.53759: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24971 1727096419.53769: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 24971 1727096419.53783: stderr chunk (state=3): >>>debug2: match found <<< 24971 1727096419.53793: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096419.53861: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096419.53919: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096419.53924: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096419.53989: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 24971 1727096419.70505: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "veth0", "type", "veth", "peer", "name", "peerveth0"], "start": "2024-09-23 09:00:19.691262", "end": "2024-09-23 09:00:19.696907", "delta": "0:00:00.005645", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add veth0 type veth peer name peerveth0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 24971 1727096419.73006: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. <<< 24971 1727096419.73010: stdout chunk (state=3): >>><<< 24971 1727096419.73012: stderr chunk (state=3): >>><<< 24971 1727096419.73015: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "veth0", "type", "veth", "peer", "name", "peerveth0"], "start": "2024-09-23 09:00:19.691262", "end": "2024-09-23 09:00:19.696907", "delta": "0:00:00.005645", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add veth0 type veth peer name peerveth0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
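For readability, the following is a hedged reconstruction of the task this trace appears to be executing. It is only a sketch inferred from the evidence in the log above (the items lookup, the two evaluated conditionals, and the command returned by the command module); the variable names, the peer-name pattern peer{{ interface }}, and any loop items beyond the two observed commands are assumptions, not confirmed by the log.

    # Sketch only; reconstructed from the trace, not taken from the original source.
    - name: Create veth interface {{ interface }}
      command: "{{ item }}"
      with_items:
        - ip link add {{ interface }} type veth peer name peer{{ interface }}   # observed as: ip link add veth0 type veth peer name peerveth0
        - ip link set peer{{ interface }} up                                    # observed as: ip link set peerveth0 up
      when:
        - ansible_distribution_major_version != '6'
        - type == 'veth' and state == 'present' and interface not in current_interfaces

The chunks that follow repeat the same per-item pipeline visible above: create a remote temporary directory under /root/.ansible/tmp, transfer AnsiballZ_command.py over SFTP, chmod and run it with /usr/bin/python3.12, then remove the temporary directory. Note also that the per-item display below reports "changed": false even though the module result here reports "changed": true; that suggests the task may override its changed status (for example via changed_when), but this is an inference and is not shown in the log.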
24971 1727096419.73234: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link add veth0 type veth peer name peerveth0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096419.4335322-25289-55248454351692/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24971 1727096419.73238: _low_level_execute_command(): starting 24971 1727096419.73240: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096419.4335322-25289-55248454351692/ > /dev/null 2>&1 && sleep 0' 24971 1727096419.74730: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096419.74746: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 24971 1727096419.74876: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096419.74955: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096419.74967: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096419.75076: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096419.75148: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096419.78600: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096419.79074: stderr chunk (state=3): >>><<< 24971 1727096419.79079: stdout chunk (state=3): >>><<< 24971 1727096419.79081: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096419.79083: handler run complete 24971 1727096419.79085: Evaluated conditional (False): False 24971 1727096419.79086: attempt loop complete, returning result 24971 1727096419.79088: variable 'item' from source: unknown 24971 1727096419.79089: variable 'item' from source: unknown ok: [managed_node3] => (item=ip link add veth0 type veth peer name peerveth0) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "add", "veth0", "type", "veth", "peer", "name", "peerveth0" ], "delta": "0:00:00.005645", "end": "2024-09-23 09:00:19.696907", "item": "ip link add veth0 type veth peer name peerveth0", "rc": 0, "start": "2024-09-23 09:00:19.691262" } 24971 1727096419.79603: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096419.79607: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096419.79610: variable 'omit' from source: magic vars 24971 1727096419.79927: variable 'ansible_distribution_major_version' from source: facts 24971 1727096419.79962: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096419.80159: variable 'type' from source: play vars 24971 1727096419.80172: variable 'state' from source: include params 24971 1727096419.80185: variable 'interface' from source: play vars 24971 1727096419.80193: variable 'current_interfaces' from source: set_fact 24971 1727096419.80331: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 24971 1727096419.80334: variable 'omit' from source: magic vars 24971 1727096419.80336: variable 'omit' from source: magic vars 24971 1727096419.80341: variable 'item' from source: unknown 24971 1727096419.80586: variable 'item' from source: unknown 24971 1727096419.80612: variable 'omit' from source: magic vars 24971 1727096419.80939: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24971 1727096419.80943: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096419.80945: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096419.80947: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24971 1727096419.80949: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096419.80951: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096419.80958: Set connection var ansible_shell_type to sh 24971 1727096419.80971: Set connection var ansible_shell_executable to /bin/sh 24971 1727096419.80985: Set connection var ansible_timeout to 10 24971 1727096419.80992: Set connection var ansible_connection to ssh 24971 1727096419.80999: Set connection var ansible_pipelining to False 24971 1727096419.81006: Set connection var ansible_module_compression to 
ZIP_DEFLATED 24971 1727096419.81026: variable 'ansible_shell_executable' from source: unknown 24971 1727096419.81052: variable 'ansible_connection' from source: unknown 24971 1727096419.81059: variable 'ansible_module_compression' from source: unknown 24971 1727096419.81079: variable 'ansible_shell_type' from source: unknown 24971 1727096419.81086: variable 'ansible_shell_executable' from source: unknown 24971 1727096419.81265: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096419.81274: variable 'ansible_pipelining' from source: unknown 24971 1727096419.81276: variable 'ansible_timeout' from source: unknown 24971 1727096419.81279: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096419.81281: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24971 1727096419.81383: variable 'omit' from source: magic vars 24971 1727096419.81391: starting attempt loop 24971 1727096419.81397: running the handler 24971 1727096419.81408: _low_level_execute_command(): starting 24971 1727096419.81415: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24971 1727096419.82634: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096419.82649: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096419.82665: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096419.82915: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096419.82984: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096419.83016: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096419.84703: stdout chunk (state=3): >>>/root <<< 24971 1727096419.84992: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096419.85026: stderr chunk (state=3): >>><<< 24971 1727096419.85276: stdout chunk (state=3): >>><<< 24971 1727096419.85282: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096419.85284: _low_level_execute_command(): starting 24971 1727096419.85287: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096419.8520293-25289-128430335531663 `" && echo ansible-tmp-1727096419.8520293-25289-128430335531663="` echo /root/.ansible/tmp/ansible-tmp-1727096419.8520293-25289-128430335531663 `" ) && sleep 0' 24971 1727096419.86429: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096419.86488: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096419.86551: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24971 1727096419.86563: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096419.86643: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096419.86647: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096419.86650: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096419.86860: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096419.86990: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096419.88932: stdout chunk (state=3): >>>ansible-tmp-1727096419.8520293-25289-128430335531663=/root/.ansible/tmp/ansible-tmp-1727096419.8520293-25289-128430335531663 <<< 24971 1727096419.89777: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096419.89781: stdout chunk (state=3): >>><<< 24971 1727096419.89783: stderr chunk (state=3): >>><<< 24971 1727096419.89785: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096419.8520293-25289-128430335531663=/root/.ansible/tmp/ansible-tmp-1727096419.8520293-25289-128430335531663 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096419.89787: variable 'ansible_module_compression' from source: unknown 24971 1727096419.89789: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24971ctsu8jan/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 24971 1727096419.89791: variable 'ansible_facts' from source: unknown 24971 1727096419.89792: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096419.8520293-25289-128430335531663/AnsiballZ_command.py 24971 1727096419.90511: Sending initial data 24971 1727096419.90514: Sent initial data (156 bytes) 24971 1727096419.91575: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096419.91601: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096419.91604: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 24971 1727096419.91607: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096419.91725: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096419.91799: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096419.91989: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096419.93503: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server 
supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 debug2: Sending SSH2_FXP_REALPATH "." debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096419.8520293-25289-128430335531663/AnsiballZ_command.py" <<< 24971 1727096419.93507: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24971ctsu8jan/tmp5x00zi9w /root/.ansible/tmp/ansible-tmp-1727096419.8520293-25289-128430335531663/AnsiballZ_command.py <<< 24971 1727096419.93586: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24971ctsu8jan/tmp5x00zi9w" to remote "/root/.ansible/tmp/ansible-tmp-1727096419.8520293-25289-128430335531663/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096419.8520293-25289-128430335531663/AnsiballZ_command.py" <<< 24971 1727096419.94949: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096419.94990: stderr chunk (state=3): >>><<< 24971 1727096419.95006: stdout chunk (state=3): >>><<< 24971 1727096419.95179: done transferring module to remote 24971 1727096419.95183: _low_level_execute_command(): starting 24971 1727096419.95185: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096419.8520293-25289-128430335531663/ /root/.ansible/tmp/ansible-tmp-1727096419.8520293-25289-128430335531663/AnsiballZ_command.py && sleep 0' 24971 1727096419.96879: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096419.97294: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096419.97359: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096419.99141: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096419.99145: stdout chunk (state=3): >>><<< 24971 1727096419.99147: stderr chunk (state=3): >>><<< 24971 1727096419.99242: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096419.99246: _low_level_execute_command(): starting 24971 1727096419.99248: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096419.8520293-25289-128430335531663/AnsiballZ_command.py && sleep 0' 24971 1727096420.00488: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096420.00631: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096420.00660: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096420.00728: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096420.16274: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerveth0", "up"], "start": "2024-09-23 09:00:20.155538", "end": "2024-09-23 09:00:20.159140", "delta": "0:00:00.003602", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerveth0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 24971 1727096420.18007: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
<<< 24971 1727096420.18010: stdout chunk (state=3): >>><<< 24971 1727096420.18013: stderr chunk (state=3): >>><<< 24971 1727096420.18160: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerveth0", "up"], "start": "2024-09-23 09:00:20.155538", "end": "2024-09-23 09:00:20.159140", "delta": "0:00:00.003602", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerveth0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
24971 1727096420.18175: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set peerveth0 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096419.8520293-25289-128430335531663/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24971 1727096420.18178: _low_level_execute_command(): starting 24971 1727096420.18180: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096419.8520293-25289-128430335531663/ > /dev/null 2>&1 && sleep 0' 24971 1727096420.19335: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24971 1727096420.19345: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096420.19355: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096420.19374: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24971 1727096420.19413: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 24971 1727096420.19503: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096420.19697: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096420.19758: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096420.21593: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096420.21597: stdout chunk (state=3): >>><<< 24971 1727096420.21599: stderr chunk (state=3): >>><<< 24971 1727096420.21627: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096420.21631: handler run complete 24971 1727096420.21654: Evaluated conditional (False): False 24971 1727096420.21846: attempt loop complete, returning result 24971 1727096420.21849: variable 'item' from source: unknown 24971 1727096420.22063: variable 'item' from source: unknown ok: [managed_node3] => (item=ip link set peerveth0 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "peerveth0", "up" ], "delta": "0:00:00.003602", "end": "2024-09-23 09:00:20.159140", "item": "ip link set peerveth0 up", "rc": 0, "start": "2024-09-23 09:00:20.155538" } 24971 1727096420.22164: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096420.22373: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096420.22377: variable 'omit' from source: magic vars 24971 1727096420.22547: variable 'ansible_distribution_major_version' from source: facts 24971 1727096420.22615: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096420.22909: variable 'type' from source: play vars 24971 1727096420.23173: variable 'state' from source: include params 24971 1727096420.23176: variable 'interface' from source: play vars 24971 1727096420.23179: variable 'current_interfaces' from source: set_fact 24971 1727096420.23181: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 24971 1727096420.23184: variable 'omit' from source: magic vars 24971 1727096420.23186: variable 'omit' from source: magic vars 24971 1727096420.23188: variable 'item' from source: unknown 24971 1727096420.23233: variable 'item' from source: unknown 24971 1727096420.23319: variable 'omit' from source: magic vars 24971 1727096420.23345: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24971 1727096420.23383: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096420.23395: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096420.23427: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24971 1727096420.23457: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096420.23466: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096420.23617: Set connection var ansible_shell_type to sh 24971 1727096420.23680: Set connection var ansible_shell_executable to /bin/sh 24971 1727096420.23694: Set connection var ansible_timeout to 10 24971 1727096420.23704: Set connection var ansible_connection to ssh 24971 1727096420.23745: Set connection var ansible_pipelining to False 24971 1727096420.23756: Set connection var ansible_module_compression to ZIP_DEFLATED 24971 1727096420.23781: variable 
'ansible_shell_executable' from source: unknown 24971 1727096420.23813: variable 'ansible_connection' from source: unknown 24971 1727096420.24049: variable 'ansible_module_compression' from source: unknown 24971 1727096420.24051: variable 'ansible_shell_type' from source: unknown 24971 1727096420.24053: variable 'ansible_shell_executable' from source: unknown 24971 1727096420.24055: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096420.24058: variable 'ansible_pipelining' from source: unknown 24971 1727096420.24060: variable 'ansible_timeout' from source: unknown 24971 1727096420.24062: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096420.24064: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24971 1727096420.24071: variable 'omit' from source: magic vars 24971 1727096420.24080: starting attempt loop 24971 1727096420.24187: running the handler 24971 1727096420.24190: _low_level_execute_command(): starting 24971 1727096420.24193: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24971 1727096420.25453: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24971 1727096420.25483: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24971 1727096420.25582: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096420.25730: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096420.25786: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096420.27493: stdout chunk (state=3): >>>/root <<< 24971 1727096420.27506: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096420.27769: stderr chunk (state=3): >>><<< 24971 1727096420.27773: stdout chunk (state=3): >>><<< 24971 1727096420.27775: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096420.27777: _low_level_execute_command(): starting 24971 1727096420.27779: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096420.2769806-25289-275130253434389 `" && echo ansible-tmp-1727096420.2769806-25289-275130253434389="` echo /root/.ansible/tmp/ansible-tmp-1727096420.2769806-25289-275130253434389 `" ) && sleep 0' 24971 1727096420.28435: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24971 1727096420.28473: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096420.28477: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096420.28479: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24971 1727096420.28482: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 24971 1727096420.28492: stderr chunk (state=3): >>>debug2: match not found <<< 24971 1727096420.28658: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096420.28663: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24971 1727096420.28670: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096420.28673: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096420.28710: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096420.30589: stdout chunk (state=3): >>>ansible-tmp-1727096420.2769806-25289-275130253434389=/root/.ansible/tmp/ansible-tmp-1727096420.2769806-25289-275130253434389 <<< 24971 1727096420.30726: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096420.30729: stdout chunk (state=3): >>><<< 24971 1727096420.30736: stderr chunk (state=3): >>><<< 24971 1727096420.30754: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096420.2769806-25289-275130253434389=/root/.ansible/tmp/ansible-tmp-1727096420.2769806-25289-275130253434389 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096420.30983: variable 'ansible_module_compression' from source: unknown 24971 1727096420.31021: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24971ctsu8jan/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 24971 1727096420.31038: variable 'ansible_facts' from source: unknown 24971 1727096420.31115: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096420.2769806-25289-275130253434389/AnsiballZ_command.py 24971 1727096420.31588: Sending initial data 24971 1727096420.31592: Sent initial data (156 bytes) 24971 1727096420.32684: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096420.32697: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24971 1727096420.32706: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address <<< 24971 1727096420.32784: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096420.33062: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096420.33065: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096420.33071: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096420.34600: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server 
supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 24971 1727096420.34604: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24971 1727096420.34631: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 24971 1727096420.34852: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24971ctsu8jan/tmptvvqcbfp /root/.ansible/tmp/ansible-tmp-1727096420.2769806-25289-275130253434389/AnsiballZ_command.py <<< 24971 1727096420.34856: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096420.2769806-25289-275130253434389/AnsiballZ_command.py" debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24971ctsu8jan/tmptvvqcbfp" to remote "/root/.ansible/tmp/ansible-tmp-1727096420.2769806-25289-275130253434389/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096420.2769806-25289-275130253434389/AnsiballZ_command.py" <<< 24971 1727096420.36221: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096420.36241: stderr chunk (state=3): >>><<< 24971 1727096420.36250: stdout chunk (state=3): >>><<< 24971 1727096420.36298: done transferring module to remote 24971 1727096420.36466: _low_level_execute_command(): starting 24971 1727096420.36478: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096420.2769806-25289-275130253434389/ /root/.ansible/tmp/ansible-tmp-1727096420.2769806-25289-275130253434389/AnsiballZ_command.py && sleep 0' 24971 1727096420.37582: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24971 1727096420.37645: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 24971 1727096420.37657: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 24971 1727096420.37716: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096420.37720: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096420.37791: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096420.37894: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096420.39972: stderr chunk (state=3): >>>debug2: Received exit status 
from master 0 <<< 24971 1727096420.39976: stderr chunk (state=3): >>><<< 24971 1727096420.39978: stdout chunk (state=3): >>><<< 24971 1727096420.39981: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096420.39983: _low_level_execute_command(): starting 24971 1727096420.39985: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096420.2769806-25289-275130253434389/AnsiballZ_command.py && sleep 0' 24971 1727096420.41150: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096420.41200: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096420.41205: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096420.41242: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096420.56866: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "veth0", "up"], "start": "2024-09-23 09:00:20.560994", "end": "2024-09-23 09:00:20.564748", "delta": "0:00:00.003754", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set veth0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 24971 
1727096420.58395: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096420.58409: stderr chunk (state=3): >>>Shared connection to 10.31.14.152 closed. <<< 24971 1727096420.58461: stderr chunk (state=3): >>><<< 24971 1727096420.58489: stdout chunk (state=3): >>><<< 24971 1727096420.58514: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "veth0", "up"], "start": "2024-09-23 09:00:20.560994", "end": "2024-09-23 09:00:20.564748", "delta": "0:00:00.003754", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set veth0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
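The module JSON above belongs to the "Create veth interface veth0" task, which runs "ip link set veth0 up" through the command module as one loop item. A minimal sketch of such a task follows; only the command, the loop item, and the final "changed": false reported later are taken from this log, so every other line (including changed_when and any additional loop items) is an assumption rather than the literal contents of manage_test_interface.yml.

    # Hedged reconstruction -- not the literal task file
    - name: Create veth interface {{ interface }}
      ansible.builtin.command: "{{ item }}"
      loop:
        # Only this item appears in the log; any other setup commands
        # (e.g. "ip link add ...") would be assumptions.
        - ip link set {{ interface }} up
      changed_when: false   # assumption, inferred from "changed": false in the task result
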
24971 1727096420.58551: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set veth0 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096420.2769806-25289-275130253434389/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24971 1727096420.58560: _low_level_execute_command(): starting 24971 1727096420.58575: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096420.2769806-25289-275130253434389/ > /dev/null 2>&1 && sleep 0' 24971 1727096420.59201: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24971 1727096420.59215: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096420.59285: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096420.59344: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096420.59365: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096420.59426: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096420.61520: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096420.61529: stdout chunk (state=3): >>><<< 24971 1727096420.61539: stderr chunk (state=3): >>><<< 24971 1727096420.61558: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096420.61573: handler run complete 24971 1727096420.61781: Evaluated conditional (False): False 24971 1727096420.61784: attempt loop complete, returning result 24971 1727096420.61786: variable 'item' from source: unknown 24971 1727096420.61788: variable 'item' from source: unknown ok: [managed_node3] => (item=ip link set veth0 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "veth0", "up" ], "delta": "0:00:00.003754", "end": "2024-09-23 09:00:20.564748", "item": "ip link set veth0 up", "rc": 0, "start": "2024-09-23 09:00:20.560994" } 24971 1727096420.61887: dumping result to json 24971 1727096420.61891: done dumping result, returning 24971 1727096420.61893: done running TaskExecutor() for managed_node3/TASK: Create veth interface veth0 [0afff68d-5257-3482-6844-00000000015a] 24971 1727096420.61895: sending task result for task 0afff68d-5257-3482-6844-00000000015a 24971 1727096420.62196: done sending task result for task 0afff68d-5257-3482-6844-00000000015a 24971 1727096420.62199: WORKER PROCESS EXITING 24971 1727096420.62265: no more pending results, returning what we have 24971 1727096420.62272: results queue empty 24971 1727096420.62273: checking for any_errors_fatal 24971 1727096420.62278: done checking for any_errors_fatal 24971 1727096420.62279: checking for max_fail_percentage 24971 1727096420.62280: done checking for max_fail_percentage 24971 1727096420.62281: checking to see if all hosts have failed and the running result is not ok 24971 1727096420.62281: done checking to see if all hosts have failed 24971 1727096420.62282: getting the remaining hosts for this loop 24971 1727096420.62284: done getting the remaining hosts for this loop 24971 1727096420.62287: getting the next task for host managed_node3 24971 1727096420.62292: done getting next task for host managed_node3 24971 1727096420.62295: ^ task is: TASK: Set up veth as managed by NetworkManager 24971 1727096420.62298: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24971 1727096420.62306: getting variables 24971 1727096420.62307: in VariableManager get_vars() 24971 1727096420.62346: Calling all_inventory to load vars for managed_node3 24971 1727096420.62349: Calling groups_inventory to load vars for managed_node3 24971 1727096420.62351: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096420.62363: Calling all_plugins_play to load vars for managed_node3 24971 1727096420.62365: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096420.62475: Calling groups_plugins_play to load vars for managed_node3 24971 1727096420.62707: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096420.62911: done with get_vars() 24971 1727096420.62922: done getting variables 24971 1727096420.62982: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set up veth as managed by NetworkManager] ******************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:35 Monday 23 September 2024 09:00:20 -0400 (0:00:01.251) 0:00:08.107 ****** 24971 1727096420.63009: entering _queue_task() for managed_node3/command 24971 1727096420.63249: worker is 1 (out of 1 available) 24971 1727096420.63260: exiting _queue_task() for managed_node3/command 24971 1727096420.63378: done queuing things up, now waiting for results queue to drain 24971 1727096420.63380: waiting for pending results... 
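The task queued here (manage_test_interface.yml:35) ends up executing "nmcli d set veth0 managed true" and is gated on the conditional "type == 'veth' and state == 'present'", both of which appear further down in this log. A hedged sketch, assuming the command module is called directly:

    # Hedged sketch based on the command and conditional visible in this log
    - name: Set up veth as managed by NetworkManager
      ansible.builtin.command: nmcli d set {{ interface }} managed true
      when: type == 'veth' and state == 'present'
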
24971 1727096420.63520: running TaskExecutor() for managed_node3/TASK: Set up veth as managed by NetworkManager 24971 1727096420.63616: in run() - task 0afff68d-5257-3482-6844-00000000015b 24971 1727096420.63634: variable 'ansible_search_path' from source: unknown 24971 1727096420.63641: variable 'ansible_search_path' from source: unknown 24971 1727096420.63683: calling self._execute() 24971 1727096420.63763: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096420.63779: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096420.63874: variable 'omit' from source: magic vars 24971 1727096420.64148: variable 'ansible_distribution_major_version' from source: facts 24971 1727096420.64164: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096420.64335: variable 'type' from source: play vars 24971 1727096420.64346: variable 'state' from source: include params 24971 1727096420.64356: Evaluated conditional (type == 'veth' and state == 'present'): True 24971 1727096420.64366: variable 'omit' from source: magic vars 24971 1727096420.64414: variable 'omit' from source: magic vars 24971 1727096420.64518: variable 'interface' from source: play vars 24971 1727096420.64547: variable 'omit' from source: magic vars 24971 1727096420.64591: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24971 1727096420.64639: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24971 1727096420.64653: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24971 1727096420.64747: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096420.64751: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096420.64754: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24971 1727096420.64756: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096420.64758: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096420.64836: Set connection var ansible_shell_type to sh 24971 1727096420.64851: Set connection var ansible_shell_executable to /bin/sh 24971 1727096420.64873: Set connection var ansible_timeout to 10 24971 1727096420.64884: Set connection var ansible_connection to ssh 24971 1727096420.64970: Set connection var ansible_pipelining to False 24971 1727096420.64974: Set connection var ansible_module_compression to ZIP_DEFLATED 24971 1727096420.64977: variable 'ansible_shell_executable' from source: unknown 24971 1727096420.64979: variable 'ansible_connection' from source: unknown 24971 1727096420.64981: variable 'ansible_module_compression' from source: unknown 24971 1727096420.64983: variable 'ansible_shell_type' from source: unknown 24971 1727096420.64985: variable 'ansible_shell_executable' from source: unknown 24971 1727096420.64987: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096420.64989: variable 'ansible_pipelining' from source: unknown 24971 1727096420.64991: variable 'ansible_timeout' from source: unknown 24971 1727096420.64993: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096420.65115: Loading ActionModule 'command' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24971 1727096420.65131: variable 'omit' from source: magic vars 24971 1727096420.65140: starting attempt loop 24971 1727096420.65147: running the handler 24971 1727096420.65167: _low_level_execute_command(): starting 24971 1727096420.65189: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24971 1727096420.65959: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096420.66055: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096420.66084: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096420.66180: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096420.67801: stdout chunk (state=3): >>>/root <<< 24971 1727096420.67946: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096420.67974: stdout chunk (state=3): >>><<< 24971 1727096420.67977: stderr chunk (state=3): >>><<< 24971 1727096420.68086: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096420.68089: _low_level_execute_command(): starting 24971 1727096420.68092: 
_low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096420.679978-25354-267317563775125 `" && echo ansible-tmp-1727096420.679978-25354-267317563775125="` echo /root/.ansible/tmp/ansible-tmp-1727096420.679978-25354-267317563775125 `" ) && sleep 0' 24971 1727096420.68646: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24971 1727096420.68663: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096420.68684: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096420.68725: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096420.68833: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24971 1727096420.68837: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096420.68857: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096420.68888: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096420.68909: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096420.68990: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096420.70881: stdout chunk (state=3): >>>ansible-tmp-1727096420.679978-25354-267317563775125=/root/.ansible/tmp/ansible-tmp-1727096420.679978-25354-267317563775125 <<< 24971 1727096420.71047: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096420.71051: stdout chunk (state=3): >>><<< 24971 1727096420.71053: stderr chunk (state=3): >>><<< 24971 1727096420.71274: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096420.679978-25354-267317563775125=/root/.ansible/tmp/ansible-tmp-1727096420.679978-25354-267317563775125 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096420.71278: variable 'ansible_module_compression' from source: unknown 24971 1727096420.71281: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24971ctsu8jan/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 24971 1727096420.71283: variable 'ansible_facts' from source: unknown 24971 1727096420.71299: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096420.679978-25354-267317563775125/AnsiballZ_command.py 24971 1727096420.71497: Sending initial data 24971 1727096420.71500: Sent initial data (155 bytes) 24971 1727096420.72165: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096420.72176: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096420.72190: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096420.72252: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096420.73848: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 24971 1727096420.73852: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 24971 1727096420.73860: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 24971 1727096420.73862: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 <<< 24971 1727096420.73865: stderr chunk (state=3): >>>debug2: Server supports extension "hardlink@openssh.com" revision 1 <<< 24971 1727096420.73866: stderr chunk (state=3): >>>debug2: Server supports extension "fsync@openssh.com" revision 1 <<< 24971 1727096420.73872: stderr chunk (state=3): >>>debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24971 1727096420.73904: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 24971 1727096420.73929: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24971ctsu8jan/tmpfr43kjgy /root/.ansible/tmp/ansible-tmp-1727096420.679978-25354-267317563775125/AnsiballZ_command.py <<< 24971 1727096420.73961: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096420.679978-25354-267317563775125/AnsiballZ_command.py" <<< 24971 1727096420.73999: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24971ctsu8jan/tmpfr43kjgy" to remote "/root/.ansible/tmp/ansible-tmp-1727096420.679978-25354-267317563775125/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096420.679978-25354-267317563775125/AnsiballZ_command.py" <<< 24971 1727096420.74659: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096420.74673: stderr chunk (state=3): >>><<< 24971 1727096420.74681: stdout chunk (state=3): >>><<< 24971 1727096420.74710: done transferring module to remote 24971 1727096420.74724: _low_level_execute_command(): starting 24971 1727096420.74796: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096420.679978-25354-267317563775125/ /root/.ansible/tmp/ansible-tmp-1727096420.679978-25354-267317563775125/AnsiballZ_command.py && sleep 0' 24971 1727096420.75332: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24971 1727096420.75344: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096420.75388: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 24971 1727096420.75432: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096420.75496: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096420.75534: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096420.75548: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096420.75602: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096420.77435: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096420.77439: stdout chunk (state=3): >>><<< 24971 1727096420.77442: stderr chunk (state=3): >>><<< 24971 1727096420.77543: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 
originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096420.77546: _low_level_execute_command(): starting 24971 1727096420.77549: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096420.679978-25354-267317563775125/AnsiballZ_command.py && sleep 0' 24971 1727096420.78119: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24971 1727096420.78185: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096420.78233: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096420.78251: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096420.78283: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096420.78354: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096420.95504: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "veth0", "managed", "true"], "start": "2024-09-23 09:00:20.931799", "end": "2024-09-23 09:00:20.949276", "delta": "0:00:00.017477", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set veth0 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 24971 1727096420.97135: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
<<< 24971 1727096420.97139: stdout chunk (state=3): >>><<< 24971 1727096420.97141: stderr chunk (state=3): >>><<< 24971 1727096420.97144: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "veth0", "managed", "true"], "start": "2024-09-23 09:00:20.931799", "end": "2024-09-23 09:00:20.949276", "delta": "0:00:00.017477", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set veth0 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
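Note that the raw module JSON above reports "changed": true, while the task result printed a few records later shows "changed": false. A likely explanation, though not confirmed by this log, is that the task suppresses change reporting, for example:

    - name: Set up veth as managed by NetworkManager
      ansible.builtin.command: nmcli d set {{ interface }} managed true
      changed_when: false   # assumption; would account for the downgraded "changed" value
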
24971 1727096420.97146: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli d set veth0 managed true', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096420.679978-25354-267317563775125/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24971 1727096420.97149: _low_level_execute_command(): starting 24971 1727096420.97151: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096420.679978-25354-267317563775125/ > /dev/null 2>&1 && sleep 0' 24971 1727096420.98392: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096420.98440: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096420.98472: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096421.00376: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096421.00380: stdout chunk (state=3): >>><<< 24971 1727096421.00382: stderr chunk (state=3): >>><<< 24971 1727096421.00385: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096421.00387: handler run complete 24971 1727096421.00530: Evaluated conditional (False): False 24971 1727096421.00534: attempt loop complete, returning result 24971 1727096421.00536: _execute() done 24971 1727096421.00539: dumping result to json 24971 1727096421.00541: done dumping result, returning 24971 1727096421.00543: done running TaskExecutor() for managed_node3/TASK: Set up veth as managed by NetworkManager [0afff68d-5257-3482-6844-00000000015b] 24971 1727096421.00545: sending task result for task 0afff68d-5257-3482-6844-00000000015b ok: [managed_node3] => { "changed": false, "cmd": [ "nmcli", "d", "set", "veth0", "managed", "true" ], "delta": "0:00:00.017477", "end": "2024-09-23 09:00:20.949276", "rc": 0, "start": "2024-09-23 09:00:20.931799" } 24971 1727096421.00756: no more pending results, returning what we have 24971 1727096421.00760: results queue empty 24971 1727096421.00761: checking for any_errors_fatal 24971 1727096421.00783: done checking for any_errors_fatal 24971 1727096421.00784: checking for max_fail_percentage 24971 1727096421.00786: done checking for max_fail_percentage 24971 1727096421.00787: checking to see if all hosts have failed and the running result is not ok 24971 1727096421.00788: done checking to see if all hosts have failed 24971 1727096421.00789: getting the remaining hosts for this loop 24971 1727096421.00790: done getting the remaining hosts for this loop 24971 1727096421.00794: getting the next task for host managed_node3 24971 1727096421.00800: done getting next task for host managed_node3 24971 1727096421.00803: ^ task is: TASK: Delete veth interface {{ interface }} 24971 1727096421.00807: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24971 1727096421.00811: getting variables 24971 1727096421.00813: in VariableManager get_vars() 24971 1727096421.00854: Calling all_inventory to load vars for managed_node3 24971 1727096421.00857: Calling groups_inventory to load vars for managed_node3 24971 1727096421.00860: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096421.01223: Calling all_plugins_play to load vars for managed_node3 24971 1727096421.01227: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096421.01232: Calling groups_plugins_play to load vars for managed_node3 24971 1727096421.01615: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096421.02117: done with get_vars() 24971 1727096421.02129: done getting variables 24971 1727096421.02160: done sending task result for task 0afff68d-5257-3482-6844-00000000015b 24971 1727096421.02163: WORKER PROCESS EXITING 24971 1727096421.02204: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 24971 1727096421.02424: variable 'interface' from source: play vars TASK [Delete veth interface veth0] ********************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:43 Monday 23 September 2024 09:00:21 -0400 (0:00:00.394) 0:00:08.502 ****** 24971 1727096421.02454: entering _queue_task() for managed_node3/command 24971 1727096421.02982: worker is 1 (out of 1 available) 24971 1727096421.02994: exiting _queue_task() for managed_node3/command 24971 1727096421.03005: done queuing things up, now waiting for results queue to drain 24971 1727096421.03005: waiting for pending results... 
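The "Delete veth interface veth0" task queued here is skipped shortly afterwards because its conditional "type == 'veth' and state == 'absent' and interface in current_interfaces" evaluates to False in this run (state is 'present'). A hedged sketch of the task, quoting the conditional as reported by the log:

    - name: Delete veth interface {{ interface }}
      # The exact command is not shown in this log; "ip link del" is an assumption.
      ansible.builtin.command: ip link del {{ interface }}
      when: type == 'veth' and state == 'absent' and interface in current_interfaces
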
24971 1727096421.03356: running TaskExecutor() for managed_node3/TASK: Delete veth interface veth0 24971 1727096421.03453: in run() - task 0afff68d-5257-3482-6844-00000000015c 24971 1727096421.03595: variable 'ansible_search_path' from source: unknown 24971 1727096421.03603: variable 'ansible_search_path' from source: unknown 24971 1727096421.03643: calling self._execute() 24971 1727096421.03864: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096421.03881: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096421.03895: variable 'omit' from source: magic vars 24971 1727096421.04612: variable 'ansible_distribution_major_version' from source: facts 24971 1727096421.04696: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096421.05116: variable 'type' from source: play vars 24971 1727096421.05128: variable 'state' from source: include params 24971 1727096421.05342: variable 'interface' from source: play vars 24971 1727096421.05346: variable 'current_interfaces' from source: set_fact 24971 1727096421.05349: Evaluated conditional (type == 'veth' and state == 'absent' and interface in current_interfaces): False 24971 1727096421.05351: when evaluation is False, skipping this task 24971 1727096421.05354: _execute() done 24971 1727096421.05356: dumping result to json 24971 1727096421.05358: done dumping result, returning 24971 1727096421.05360: done running TaskExecutor() for managed_node3/TASK: Delete veth interface veth0 [0afff68d-5257-3482-6844-00000000015c] 24971 1727096421.05362: sending task result for task 0afff68d-5257-3482-6844-00000000015c 24971 1727096421.05430: done sending task result for task 0afff68d-5257-3482-6844-00000000015c 24971 1727096421.05433: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'veth' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 24971 1727096421.05499: no more pending results, returning what we have 24971 1727096421.05503: results queue empty 24971 1727096421.05504: checking for any_errors_fatal 24971 1727096421.05516: done checking for any_errors_fatal 24971 1727096421.05516: checking for max_fail_percentage 24971 1727096421.05518: done checking for max_fail_percentage 24971 1727096421.05519: checking to see if all hosts have failed and the running result is not ok 24971 1727096421.05520: done checking to see if all hosts have failed 24971 1727096421.05521: getting the remaining hosts for this loop 24971 1727096421.05522: done getting the remaining hosts for this loop 24971 1727096421.05526: getting the next task for host managed_node3 24971 1727096421.05531: done getting next task for host managed_node3 24971 1727096421.05534: ^ task is: TASK: Create dummy interface {{ interface }} 24971 1727096421.05538: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24971 1727096421.05543: getting variables 24971 1727096421.05544: in VariableManager get_vars() 24971 1727096421.05591: Calling all_inventory to load vars for managed_node3 24971 1727096421.05594: Calling groups_inventory to load vars for managed_node3 24971 1727096421.05596: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096421.05610: Calling all_plugins_play to load vars for managed_node3 24971 1727096421.05613: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096421.05616: Calling groups_plugins_play to load vars for managed_node3 24971 1727096421.06103: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096421.06554: done with get_vars() 24971 1727096421.06565: done getting variables 24971 1727096421.06627: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 24971 1727096421.06742: variable 'interface' from source: play vars TASK [Create dummy interface veth0] ******************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:49 Monday 23 September 2024 09:00:21 -0400 (0:00:00.043) 0:00:08.545 ****** 24971 1727096421.06777: entering _queue_task() for managed_node3/command 24971 1727096421.07028: worker is 1 (out of 1 available) 24971 1727096421.07039: exiting _queue_task() for managed_node3/command 24971 1727096421.07051: done queuing things up, now waiting for results queue to drain 24971 1727096421.07052: waiting for pending results... 
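Similarly, "Create dummy interface veth0" is skipped because "type == 'dummy' and state == 'present' and interface not in current_interfaces" is False for a veth test run. A hedged sketch, with only the conditional taken from the log:

    - name: Create dummy interface {{ interface }}
      # Assumed command; only the conditional below is quoted from the log.
      ansible.builtin.command: ip link add {{ interface }} type dummy
      when: type == 'dummy' and state == 'present' and interface not in current_interfaces
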
24971 1727096421.07397: running TaskExecutor() for managed_node3/TASK: Create dummy interface veth0 24971 1727096421.07436: in run() - task 0afff68d-5257-3482-6844-00000000015d 24971 1727096421.07459: variable 'ansible_search_path' from source: unknown 24971 1727096421.07471: variable 'ansible_search_path' from source: unknown 24971 1727096421.07522: calling self._execute() 24971 1727096421.07611: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096421.07675: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096421.07678: variable 'omit' from source: magic vars 24971 1727096421.08010: variable 'ansible_distribution_major_version' from source: facts 24971 1727096421.08030: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096421.08237: variable 'type' from source: play vars 24971 1727096421.08248: variable 'state' from source: include params 24971 1727096421.08262: variable 'interface' from source: play vars 24971 1727096421.08277: variable 'current_interfaces' from source: set_fact 24971 1727096421.08289: Evaluated conditional (type == 'dummy' and state == 'present' and interface not in current_interfaces): False 24971 1727096421.08297: when evaluation is False, skipping this task 24971 1727096421.08372: _execute() done 24971 1727096421.08376: dumping result to json 24971 1727096421.08379: done dumping result, returning 24971 1727096421.08381: done running TaskExecutor() for managed_node3/TASK: Create dummy interface veth0 [0afff68d-5257-3482-6844-00000000015d] 24971 1727096421.08384: sending task result for task 0afff68d-5257-3482-6844-00000000015d 24971 1727096421.08444: done sending task result for task 0afff68d-5257-3482-6844-00000000015d 24971 1727096421.08448: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'dummy' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 24971 1727096421.08522: no more pending results, returning what we have 24971 1727096421.08525: results queue empty 24971 1727096421.08527: checking for any_errors_fatal 24971 1727096421.08533: done checking for any_errors_fatal 24971 1727096421.08534: checking for max_fail_percentage 24971 1727096421.08536: done checking for max_fail_percentage 24971 1727096421.08536: checking to see if all hosts have failed and the running result is not ok 24971 1727096421.08537: done checking to see if all hosts have failed 24971 1727096421.08538: getting the remaining hosts for this loop 24971 1727096421.08539: done getting the remaining hosts for this loop 24971 1727096421.08543: getting the next task for host managed_node3 24971 1727096421.08549: done getting next task for host managed_node3 24971 1727096421.08552: ^ task is: TASK: Delete dummy interface {{ interface }} 24971 1727096421.08555: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24971 1727096421.08560: getting variables 24971 1727096421.08561: in VariableManager get_vars() 24971 1727096421.08608: Calling all_inventory to load vars for managed_node3 24971 1727096421.08611: Calling groups_inventory to load vars for managed_node3 24971 1727096421.08613: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096421.08628: Calling all_plugins_play to load vars for managed_node3 24971 1727096421.08631: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096421.08635: Calling groups_plugins_play to load vars for managed_node3 24971 1727096421.09013: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096421.09207: done with get_vars() 24971 1727096421.09216: done getting variables 24971 1727096421.09264: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 24971 1727096421.09365: variable 'interface' from source: play vars TASK [Delete dummy interface veth0] ******************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:54 Monday 23 September 2024 09:00:21 -0400 (0:00:00.026) 0:00:08.571 ****** 24971 1727096421.09397: entering _queue_task() for managed_node3/command 24971 1727096421.09631: worker is 1 (out of 1 available) 24971 1727096421.09642: exiting _queue_task() for managed_node3/command 24971 1727096421.09654: done queuing things up, now waiting for results queue to drain 24971 1727096421.09655: waiting for pending results... 
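Note the two forms of the same task name in the entries above: the strategy tracks the raw name "Delete dummy interface {{ interface }}", while the TASK banner printed just before queuing shows it with {{ interface }} already resolved from play vars. A minimal sketch of that relationship, assuming the task body; only the variable value veth0 is taken from this run.

    vars:
      interface: veth0                                    # value observed in this run (play vars)
    tasks:
      - name: Delete dummy interface {{ interface }}      # displayed as "Delete dummy interface veth0"
        command: ip link del {{ interface }} type dummy   # hypothetical body
        when: type == 'dummy' and state == 'absent' and interface in current_interfaces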
24971 1727096421.10085: running TaskExecutor() for managed_node3/TASK: Delete dummy interface veth0 24971 1727096421.10090: in run() - task 0afff68d-5257-3482-6844-00000000015e 24971 1727096421.10093: variable 'ansible_search_path' from source: unknown 24971 1727096421.10095: variable 'ansible_search_path' from source: unknown 24971 1727096421.10098: calling self._execute() 24971 1727096421.10136: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096421.10147: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096421.10161: variable 'omit' from source: magic vars 24971 1727096421.10503: variable 'ansible_distribution_major_version' from source: facts 24971 1727096421.10520: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096421.10724: variable 'type' from source: play vars 24971 1727096421.10733: variable 'state' from source: include params 24971 1727096421.10741: variable 'interface' from source: play vars 24971 1727096421.10749: variable 'current_interfaces' from source: set_fact 24971 1727096421.10764: Evaluated conditional (type == 'dummy' and state == 'absent' and interface in current_interfaces): False 24971 1727096421.10777: when evaluation is False, skipping this task 24971 1727096421.10784: _execute() done 24971 1727096421.10790: dumping result to json 24971 1727096421.10797: done dumping result, returning 24971 1727096421.10806: done running TaskExecutor() for managed_node3/TASK: Delete dummy interface veth0 [0afff68d-5257-3482-6844-00000000015e] 24971 1727096421.10814: sending task result for task 0afff68d-5257-3482-6844-00000000015e skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'dummy' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 24971 1727096421.11056: no more pending results, returning what we have 24971 1727096421.11059: results queue empty 24971 1727096421.11061: checking for any_errors_fatal 24971 1727096421.11066: done checking for any_errors_fatal 24971 1727096421.11071: checking for max_fail_percentage 24971 1727096421.11073: done checking for max_fail_percentage 24971 1727096421.11074: checking to see if all hosts have failed and the running result is not ok 24971 1727096421.11075: done checking to see if all hosts have failed 24971 1727096421.11075: getting the remaining hosts for this loop 24971 1727096421.11077: done getting the remaining hosts for this loop 24971 1727096421.11081: getting the next task for host managed_node3 24971 1727096421.11087: done getting next task for host managed_node3 24971 1727096421.11090: ^ task is: TASK: Create tap interface {{ interface }} 24971 1727096421.11094: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24971 1727096421.11098: getting variables 24971 1727096421.11100: in VariableManager get_vars() 24971 1727096421.11139: Calling all_inventory to load vars for managed_node3 24971 1727096421.11142: Calling groups_inventory to load vars for managed_node3 24971 1727096421.11145: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096421.11159: Calling all_plugins_play to load vars for managed_node3 24971 1727096421.11162: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096421.11165: Calling groups_plugins_play to load vars for managed_node3 24971 1727096421.11493: done sending task result for task 0afff68d-5257-3482-6844-00000000015e 24971 1727096421.11497: WORKER PROCESS EXITING 24971 1727096421.11510: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096421.11713: done with get_vars() 24971 1727096421.11723: done getting variables 24971 1727096421.11781: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 24971 1727096421.11885: variable 'interface' from source: play vars TASK [Create tap interface veth0] ********************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:60 Monday 23 September 2024 09:00:21 -0400 (0:00:00.025) 0:00:08.596 ****** 24971 1727096421.11912: entering _queue_task() for managed_node3/command 24971 1727096421.12148: worker is 1 (out of 1 available) 24971 1727096421.12161: exiting _queue_task() for managed_node3/command 24971 1727096421.12280: done queuing things up, now waiting for results queue to drain 24971 1727096421.12281: waiting for pending results... 
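Each of these skipped results carries changed: false, skip_reason, and the failing expression in false_condition. If a playbook needed to react to such a skip, the result could be registered and inspected with the "skipped" test; the register/debug follow-up below is a hypothetical sketch and not part of this test, and the command body is assumed.

    - name: Create tap interface {{ interface }}
      command: ip tuntap add dev {{ interface }} mode tap   # hypothetical body
      when: type == 'tap' and state == 'present' and interface not in current_interfaces
      register: tap_create

    - name: Report why the previous task was skipped (hypothetical follow-up)
      debug:
        msg: "skipped because: {{ tap_create.false_condition | default('n/a') }}"
      when: tap_create is skipped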
24971 1727096421.12422: running TaskExecutor() for managed_node3/TASK: Create tap interface veth0 24971 1727096421.12528: in run() - task 0afff68d-5257-3482-6844-00000000015f 24971 1727096421.12549: variable 'ansible_search_path' from source: unknown 24971 1727096421.12556: variable 'ansible_search_path' from source: unknown 24971 1727096421.12600: calling self._execute() 24971 1727096421.12685: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096421.12694: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096421.12705: variable 'omit' from source: magic vars 24971 1727096421.13053: variable 'ansible_distribution_major_version' from source: facts 24971 1727096421.13078: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096421.13292: variable 'type' from source: play vars 24971 1727096421.13303: variable 'state' from source: include params 24971 1727096421.13312: variable 'interface' from source: play vars 24971 1727096421.13322: variable 'current_interfaces' from source: set_fact 24971 1727096421.13334: Evaluated conditional (type == 'tap' and state == 'present' and interface not in current_interfaces): False 24971 1727096421.13341: when evaluation is False, skipping this task 24971 1727096421.13348: _execute() done 24971 1727096421.13355: dumping result to json 24971 1727096421.13362: done dumping result, returning 24971 1727096421.13378: done running TaskExecutor() for managed_node3/TASK: Create tap interface veth0 [0afff68d-5257-3482-6844-00000000015f] 24971 1727096421.13391: sending task result for task 0afff68d-5257-3482-6844-00000000015f 24971 1727096421.13494: done sending task result for task 0afff68d-5257-3482-6844-00000000015f skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'tap' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 24971 1727096421.13544: no more pending results, returning what we have 24971 1727096421.13548: results queue empty 24971 1727096421.13549: checking for any_errors_fatal 24971 1727096421.13555: done checking for any_errors_fatal 24971 1727096421.13556: checking for max_fail_percentage 24971 1727096421.13557: done checking for max_fail_percentage 24971 1727096421.13558: checking to see if all hosts have failed and the running result is not ok 24971 1727096421.13559: done checking to see if all hosts have failed 24971 1727096421.13560: getting the remaining hosts for this loop 24971 1727096421.13561: done getting the remaining hosts for this loop 24971 1727096421.13565: getting the next task for host managed_node3 24971 1727096421.13575: done getting next task for host managed_node3 24971 1727096421.13578: ^ task is: TASK: Delete tap interface {{ interface }} 24971 1727096421.13581: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24971 1727096421.13586: getting variables 24971 1727096421.13587: in VariableManager get_vars() 24971 1727096421.13633: Calling all_inventory to load vars for managed_node3 24971 1727096421.13636: Calling groups_inventory to load vars for managed_node3 24971 1727096421.13638: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096421.13653: Calling all_plugins_play to load vars for managed_node3 24971 1727096421.13656: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096421.13659: Calling groups_plugins_play to load vars for managed_node3 24971 1727096421.14067: WORKER PROCESS EXITING 24971 1727096421.14094: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096421.14291: done with get_vars() 24971 1727096421.14302: done getting variables 24971 1727096421.14358: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 24971 1727096421.14464: variable 'interface' from source: play vars TASK [Delete tap interface veth0] ********************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:65 Monday 23 September 2024 09:00:21 -0400 (0:00:00.025) 0:00:08.622 ****** 24971 1727096421.14498: entering _queue_task() for managed_node3/command 24971 1727096421.14742: worker is 1 (out of 1 available) 24971 1727096421.14754: exiting _queue_task() for managed_node3/command 24971 1727096421.14972: done queuing things up, now waiting for results queue to drain 24971 1727096421.14974: waiting for pending results... 
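The host-state dumps above step through tasks 9-12 of a child block, i.e. the strategy is walking the tasks of an included file rather than the play itself, and the variable sources confirm that state arrives as an include parameter while type and interface come from play vars. A sketch of an include that would produce exactly that layering; only the included file name and the fact that state is an include parameter are confirmed by this log, the parameter value is assumed for this run.

    - name: Manage the test interface
      include_tasks: tasks/manage_test_interface.yml
      vars:
        state: present   # assumed value; passed as an include parameter per the log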
24971 1727096421.15013: running TaskExecutor() for managed_node3/TASK: Delete tap interface veth0 24971 1727096421.15115: in run() - task 0afff68d-5257-3482-6844-000000000160 24971 1727096421.15132: variable 'ansible_search_path' from source: unknown 24971 1727096421.15138: variable 'ansible_search_path' from source: unknown 24971 1727096421.15178: calling self._execute() 24971 1727096421.15260: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096421.15276: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096421.15288: variable 'omit' from source: magic vars 24971 1727096421.15630: variable 'ansible_distribution_major_version' from source: facts 24971 1727096421.15651: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096421.15859: variable 'type' from source: play vars 24971 1727096421.15874: variable 'state' from source: include params 24971 1727096421.15883: variable 'interface' from source: play vars 24971 1727096421.15892: variable 'current_interfaces' from source: set_fact 24971 1727096421.15902: Evaluated conditional (type == 'tap' and state == 'absent' and interface in current_interfaces): False 24971 1727096421.15963: when evaluation is False, skipping this task 24971 1727096421.15966: _execute() done 24971 1727096421.15972: dumping result to json 24971 1727096421.15975: done dumping result, returning 24971 1727096421.15977: done running TaskExecutor() for managed_node3/TASK: Delete tap interface veth0 [0afff68d-5257-3482-6844-000000000160] 24971 1727096421.15979: sending task result for task 0afff68d-5257-3482-6844-000000000160 24971 1727096421.16038: done sending task result for task 0afff68d-5257-3482-6844-000000000160 24971 1727096421.16040: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'tap' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 24971 1727096421.16121: no more pending results, returning what we have 24971 1727096421.16125: results queue empty 24971 1727096421.16126: checking for any_errors_fatal 24971 1727096421.16134: done checking for any_errors_fatal 24971 1727096421.16135: checking for max_fail_percentage 24971 1727096421.16137: done checking for max_fail_percentage 24971 1727096421.16137: checking to see if all hosts have failed and the running result is not ok 24971 1727096421.16138: done checking to see if all hosts have failed 24971 1727096421.16139: getting the remaining hosts for this loop 24971 1727096421.16140: done getting the remaining hosts for this loop 24971 1727096421.16144: getting the next task for host managed_node3 24971 1727096421.16152: done getting next task for host managed_node3 24971 1727096421.16155: ^ task is: TASK: Set up gateway ip on veth peer 24971 1727096421.16158: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24971 1727096421.16163: getting variables 24971 1727096421.16164: in VariableManager get_vars() 24971 1727096421.16213: Calling all_inventory to load vars for managed_node3 24971 1727096421.16216: Calling groups_inventory to load vars for managed_node3 24971 1727096421.16219: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096421.16233: Calling all_plugins_play to load vars for managed_node3 24971 1727096421.16237: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096421.16240: Calling groups_plugins_play to load vars for managed_node3 24971 1727096421.16651: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096421.16849: done with get_vars() 24971 1727096421.16859: done getting variables 24971 1727096421.16945: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Set up gateway ip on veth peer] ****************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:15 Monday 23 September 2024 09:00:21 -0400 (0:00:00.024) 0:00:08.647 ****** 24971 1727096421.16975: entering _queue_task() for managed_node3/shell 24971 1727096421.16976: Creating lock for shell 24971 1727096421.17210: worker is 1 (out of 1 available) 24971 1727096421.17221: exiting _queue_task() for managed_node3/shell 24971 1727096421.17231: done queuing things up, now waiting for results queue to drain 24971 1727096421.17232: waiting for pending results... 
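The next task loads the shell action plugin for "Set up gateway ip on veth peer" (tests_ipv6.yml:15). The command script it runs is recorded verbatim in the module invocation further below; laid out as a playbook task it plausibly looks like the following, where the YAML layout is assumed and the commands are taken from the log.

    - name: Set up gateway ip on veth peer
      shell: |
        ip netns add ns1
        ip link set peerveth0 netns ns1
        ip netns exec ns1 ip -6 addr add 2001:db8::1/32 dev peerveth0
        ip netns exec ns1 ip link set peerveth0 up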
24971 1727096421.17585: running TaskExecutor() for managed_node3/TASK: Set up gateway ip on veth peer 24971 1727096421.17590: in run() - task 0afff68d-5257-3482-6844-00000000000d 24971 1727096421.17594: variable 'ansible_search_path' from source: unknown 24971 1727096421.17616: calling self._execute() 24971 1727096421.17698: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096421.17712: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096421.17726: variable 'omit' from source: magic vars 24971 1727096421.18074: variable 'ansible_distribution_major_version' from source: facts 24971 1727096421.18094: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096421.18105: variable 'omit' from source: magic vars 24971 1727096421.18141: variable 'omit' from source: magic vars 24971 1727096421.18282: variable 'interface' from source: play vars 24971 1727096421.18357: variable 'omit' from source: magic vars 24971 1727096421.18361: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24971 1727096421.18396: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24971 1727096421.18421: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24971 1727096421.18442: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096421.18459: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096421.18502: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24971 1727096421.18509: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096421.18515: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096421.18603: Set connection var ansible_shell_type to sh 24971 1727096421.18674: Set connection var ansible_shell_executable to /bin/sh 24971 1727096421.18678: Set connection var ansible_timeout to 10 24971 1727096421.18681: Set connection var ansible_connection to ssh 24971 1727096421.18683: Set connection var ansible_pipelining to False 24971 1727096421.18684: Set connection var ansible_module_compression to ZIP_DEFLATED 24971 1727096421.18686: variable 'ansible_shell_executable' from source: unknown 24971 1727096421.18687: variable 'ansible_connection' from source: unknown 24971 1727096421.18689: variable 'ansible_module_compression' from source: unknown 24971 1727096421.18691: variable 'ansible_shell_type' from source: unknown 24971 1727096421.18692: variable 'ansible_shell_executable' from source: unknown 24971 1727096421.18694: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096421.18695: variable 'ansible_pipelining' from source: unknown 24971 1727096421.18696: variable 'ansible_timeout' from source: unknown 24971 1727096421.18698: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096421.18875: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24971 1727096421.18880: variable 'omit' from source: magic vars 24971 
1727096421.18882: starting attempt loop 24971 1727096421.18884: running the handler 24971 1727096421.18887: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24971 1727096421.18910: _low_level_execute_command(): starting 24971 1727096421.18975: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24971 1727096421.19801: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24971 1727096421.19808: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096421.19820: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096421.19848: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096421.19917: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096421.21566: stdout chunk (state=3): >>>/root <<< 24971 1727096421.21712: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096421.21725: stdout chunk (state=3): >>><<< 24971 1727096421.21744: stderr chunk (state=3): >>><<< 24971 1727096421.21781: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096421.21801: 
_low_level_execute_command(): starting 24971 1727096421.21813: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096421.21789-25387-78455832438266 `" && echo ansible-tmp-1727096421.21789-25387-78455832438266="` echo /root/.ansible/tmp/ansible-tmp-1727096421.21789-25387-78455832438266 `" ) && sleep 0' 24971 1727096421.22466: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24971 1727096421.22488: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096421.22504: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096421.22531: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24971 1727096421.22591: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096421.22666: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096421.22690: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096421.22712: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096421.22796: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096421.24724: stdout chunk (state=3): >>>ansible-tmp-1727096421.21789-25387-78455832438266=/root/.ansible/tmp/ansible-tmp-1727096421.21789-25387-78455832438266 <<< 24971 1727096421.24895: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096421.24899: stdout chunk (state=3): >>><<< 24971 1727096421.24902: stderr chunk (state=3): >>><<< 24971 1727096421.24929: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096421.21789-25387-78455832438266=/root/.ansible/tmp/ansible-tmp-1727096421.21789-25387-78455832438266 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096421.24966: variable 'ansible_module_compression' from source: unknown 24971 1727096421.25074: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24971ctsu8jan/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 24971 1727096421.25082: variable 'ansible_facts' from source: unknown 24971 1727096421.25180: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096421.21789-25387-78455832438266/AnsiballZ_command.py 24971 1727096421.25396: Sending initial data 24971 1727096421.25402: Sent initial data (153 bytes) 24971 1727096421.26017: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24971 1727096421.26031: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096421.26045: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096421.26074: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24971 1727096421.26093: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 24971 1727096421.26201: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096421.26218: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096421.26286: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096421.27877: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24971 1727096421.27978: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 24971 1727096421.28011: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24971ctsu8jan/tmphzdtf2l9 /root/.ansible/tmp/ansible-tmp-1727096421.21789-25387-78455832438266/AnsiballZ_command.py <<< 24971 1727096421.28047: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096421.21789-25387-78455832438266/AnsiballZ_command.py" <<< 24971 1727096421.28081: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24971ctsu8jan/tmphzdtf2l9" to remote "/root/.ansible/tmp/ansible-tmp-1727096421.21789-25387-78455832438266/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096421.21789-25387-78455832438266/AnsiballZ_command.py" <<< 24971 1727096421.28756: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096421.28802: stderr chunk (state=3): >>><<< 24971 1727096421.28805: stdout chunk (state=3): >>><<< 24971 1727096421.28825: done transferring module to remote 24971 1727096421.28834: _low_level_execute_command(): starting 24971 1727096421.28839: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096421.21789-25387-78455832438266/ /root/.ansible/tmp/ansible-tmp-1727096421.21789-25387-78455832438266/AnsiballZ_command.py && sleep 0' 24971 1727096421.29248: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096421.29288: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 24971 1727096421.29291: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096421.29294: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration <<< 24971 1727096421.29300: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096421.29302: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096421.29338: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096421.29341: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096421.29379: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096421.31208: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096421.31234: stderr chunk (state=3): >>><<< 24971 1727096421.31237: stdout chunk (state=3): >>><<< 24971 1727096421.31248: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096421.31251: _low_level_execute_command(): starting 24971 1727096421.31257: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096421.21789-25387-78455832438266/AnsiballZ_command.py && sleep 0' 24971 1727096421.31699: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096421.31704: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096421.31706: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096421.31773: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096421.31777: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096421.31780: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096421.31860: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096421.49489: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": "ip netns add ns1\nip link set peerveth0 netns ns1\nip netns exec ns1 ip -6 addr add 2001:db8::1/32 dev peerveth0\nip netns exec ns1 ip link set peerveth0 up\n", "start": "2024-09-23 09:00:21.465490", "end": "2024-09-23 09:00:21.491635", "delta": "0:00:00.026145", "msg": "", "invocation": {"module_args": {"_raw_params": "ip netns add ns1\nip link set peerveth0 netns ns1\nip netns exec ns1 ip -6 addr add 2001:db8::1/32 dev peerveth0\nip netns exec ns1 ip link set peerveth0 up\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 24971 1727096421.51183: stderr chunk 
(state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. <<< 24971 1727096421.51187: stdout chunk (state=3): >>><<< 24971 1727096421.51190: stderr chunk (state=3): >>><<< 24971 1727096421.51212: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": "ip netns add ns1\nip link set peerveth0 netns ns1\nip netns exec ns1 ip -6 addr add 2001:db8::1/32 dev peerveth0\nip netns exec ns1 ip link set peerveth0 up\n", "start": "2024-09-23 09:00:21.465490", "end": "2024-09-23 09:00:21.491635", "delta": "0:00:00.026145", "msg": "", "invocation": {"module_args": {"_raw_params": "ip netns add ns1\nip link set peerveth0 netns ns1\nip netns exec ns1 ip -6 addr add 2001:db8::1/32 dev peerveth0\nip netns exec ns1 ip link set peerveth0 up\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
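The module result above completes the round trip: the command module was packed locally (ANSIBALLZ cache), copied over the multiplexed SSH connection into a per-task temp directory, made executable, and run with the remote python; the cleanup of that temp directory follows just below. The script itself creates namespace ns1, moves the veth peer peerveth0 into it, assigns the IPv6 gateway address 2001:db8::1/32 and brings the link up, giving the test a reachable gateway on the far end of the veth pair. A hypothetical verification task, not part of this run, could confirm the address from inside the namespace:

    - name: Check the gateway address inside ns1 (hypothetical, not in this run)
      command: ip netns exec ns1 ip -6 addr show dev peerveth0
      register: ns1_addr
      changed_when: false
      failed_when: "'2001:db8::1' not in ns1_addr.stdout"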
24971 1727096421.51253: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip netns add ns1\nip link set peerveth0 netns ns1\nip netns exec ns1 ip -6 addr add 2001:db8::1/32 dev peerveth0\nip netns exec ns1 ip link set peerveth0 up\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096421.21789-25387-78455832438266/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24971 1727096421.51288: _low_level_execute_command(): starting 24971 1727096421.51359: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096421.21789-25387-78455832438266/ > /dev/null 2>&1 && sleep 0' 24971 1727096421.51934: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24971 1727096421.51952: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096421.51964: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096421.51984: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24971 1727096421.52079: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096421.52109: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096421.52157: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096421.54023: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096421.54033: stdout chunk (state=3): >>><<< 24971 1727096421.54044: stderr chunk (state=3): >>><<< 24971 1727096421.54063: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096421.54078: handler run complete 24971 1727096421.54274: Evaluated conditional (False): False 24971 1727096421.54277: attempt loop complete, returning result 24971 1727096421.54280: _execute() done 24971 1727096421.54282: dumping result to json 24971 1727096421.54284: done dumping result, returning 24971 1727096421.54286: done running TaskExecutor() for managed_node3/TASK: Set up gateway ip on veth peer [0afff68d-5257-3482-6844-00000000000d] 24971 1727096421.54288: sending task result for task 0afff68d-5257-3482-6844-00000000000d 24971 1727096421.54358: done sending task result for task 0afff68d-5257-3482-6844-00000000000d 24971 1727096421.54361: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "ip netns add ns1\nip link set peerveth0 netns ns1\nip netns exec ns1 ip -6 addr add 2001:db8::1/32 dev peerveth0\nip netns exec ns1 ip link set peerveth0 up\n", "delta": "0:00:00.026145", "end": "2024-09-23 09:00:21.491635", "rc": 0, "start": "2024-09-23 09:00:21.465490" } 24971 1727096421.54434: no more pending results, returning what we have 24971 1727096421.54437: results queue empty 24971 1727096421.54438: checking for any_errors_fatal 24971 1727096421.54442: done checking for any_errors_fatal 24971 1727096421.54442: checking for max_fail_percentage 24971 1727096421.54445: done checking for max_fail_percentage 24971 1727096421.54445: checking to see if all hosts have failed and the running result is not ok 24971 1727096421.54446: done checking to see if all hosts have failed 24971 1727096421.54447: getting the remaining hosts for this loop 24971 1727096421.54448: done getting the remaining hosts for this loop 24971 1727096421.54452: getting the next task for host managed_node3 24971 1727096421.54459: done getting next task for host managed_node3 24971 1727096421.54462: ^ task is: TASK: TEST: I can configure an interface with static ipv6 config 24971 1727096421.54464: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24971 1727096421.54469: getting variables 24971 1727096421.54471: in VariableManager get_vars() 24971 1727096421.54514: Calling all_inventory to load vars for managed_node3 24971 1727096421.54516: Calling groups_inventory to load vars for managed_node3 24971 1727096421.54519: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096421.54530: Calling all_plugins_play to load vars for managed_node3 24971 1727096421.54533: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096421.54536: Calling groups_plugins_play to load vars for managed_node3 24971 1727096421.54924: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096421.55157: done with get_vars() 24971 1727096421.55172: done getting variables 24971 1727096421.55236: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [TEST: I can configure an interface with static ipv6 config] ************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:27 Monday 23 September 2024 09:00:21 -0400 (0:00:00.382) 0:00:09.030 ****** 24971 1727096421.55265: entering _queue_task() for managed_node3/debug 24971 1727096421.55521: worker is 1 (out of 1 available) 24971 1727096421.55532: exiting _queue_task() for managed_node3/debug 24971 1727096421.55543: done queuing things up, now waiting for results queue to drain 24971 1727096421.55544: waiting for pending results... 24971 1727096421.55896: running TaskExecutor() for managed_node3/TASK: TEST: I can configure an interface with static ipv6 config 24971 1727096421.55900: in run() - task 0afff68d-5257-3482-6844-00000000000f 24971 1727096421.55903: variable 'ansible_search_path' from source: unknown 24971 1727096421.55973: calling self._execute() 24971 1727096421.56028: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096421.56038: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096421.56102: variable 'omit' from source: magic vars 24971 1727096421.56429: variable 'ansible_distribution_major_version' from source: facts 24971 1727096421.56449: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096421.56460: variable 'omit' from source: magic vars 24971 1727096421.56487: variable 'omit' from source: magic vars 24971 1727096421.56537: variable 'omit' from source: magic vars 24971 1727096421.56578: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24971 1727096421.56617: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24971 1727096421.56648: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24971 1727096421.56683: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096421.56691: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096421.56755: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24971 1727096421.56758: variable 'ansible_host' from 
source: host vars for 'managed_node3' 24971 1727096421.56761: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096421.56847: Set connection var ansible_shell_type to sh 24971 1727096421.56871: Set connection var ansible_shell_executable to /bin/sh 24971 1727096421.56890: Set connection var ansible_timeout to 10 24971 1727096421.56975: Set connection var ansible_connection to ssh 24971 1727096421.56978: Set connection var ansible_pipelining to False 24971 1727096421.56981: Set connection var ansible_module_compression to ZIP_DEFLATED 24971 1727096421.56983: variable 'ansible_shell_executable' from source: unknown 24971 1727096421.57028: variable 'ansible_connection' from source: unknown 24971 1727096421.57039: variable 'ansible_module_compression' from source: unknown 24971 1727096421.57048: variable 'ansible_shell_type' from source: unknown 24971 1727096421.57055: variable 'ansible_shell_executable' from source: unknown 24971 1727096421.57061: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096421.57071: variable 'ansible_pipelining' from source: unknown 24971 1727096421.57086: variable 'ansible_timeout' from source: unknown 24971 1727096421.57093: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096421.57242: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24971 1727096421.57256: variable 'omit' from source: magic vars 24971 1727096421.57265: starting attempt loop 24971 1727096421.57274: running the handler 24971 1727096421.57333: handler run complete 24971 1727096421.57408: attempt loop complete, returning result 24971 1727096421.57412: _execute() done 24971 1727096421.57414: dumping result to json 24971 1727096421.57416: done dumping result, returning 24971 1727096421.57419: done running TaskExecutor() for managed_node3/TASK: TEST: I can configure an interface with static ipv6 config [0afff68d-5257-3482-6844-00000000000f] 24971 1727096421.57421: sending task result for task 0afff68d-5257-3482-6844-00000000000f ok: [managed_node3] => {} MSG: ################################################## 24971 1727096421.57579: no more pending results, returning what we have 24971 1727096421.57583: results queue empty 24971 1727096421.57584: checking for any_errors_fatal 24971 1727096421.57591: done checking for any_errors_fatal 24971 1727096421.57592: checking for max_fail_percentage 24971 1727096421.57594: done checking for max_fail_percentage 24971 1727096421.57594: checking to see if all hosts have failed and the running result is not ok 24971 1727096421.57595: done checking to see if all hosts have failed 24971 1727096421.57596: getting the remaining hosts for this loop 24971 1727096421.57597: done getting the remaining hosts for this loop 24971 1727096421.57601: getting the next task for host managed_node3 24971 1727096421.57607: done getting next task for host managed_node3 24971 1727096421.57613: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 24971 1727096421.57616: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24971 1727096421.57748: getting variables 24971 1727096421.57750: in VariableManager get_vars() 24971 1727096421.57786: Calling all_inventory to load vars for managed_node3 24971 1727096421.57789: Calling groups_inventory to load vars for managed_node3 24971 1727096421.57791: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096421.57800: Calling all_plugins_play to load vars for managed_node3 24971 1727096421.57803: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096421.57806: Calling groups_plugins_play to load vars for managed_node3 24971 1727096421.58094: done sending task result for task 0afff68d-5257-3482-6844-00000000000f 24971 1727096421.58097: WORKER PROCESS EXITING 24971 1727096421.58118: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096421.58336: done with get_vars() 24971 1727096421.58345: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Monday 23 September 2024 09:00:21 -0400 (0:00:00.031) 0:00:09.062 ****** 24971 1727096421.58442: entering _queue_task() for managed_node3/include_tasks 24971 1727096421.58872: worker is 1 (out of 1 available) 24971 1727096421.58882: exiting _queue_task() for managed_node3/include_tasks 24971 1727096421.58891: done queuing things up, now waiting for results queue to drain 24971 1727096421.58892: waiting for pending results... 
24971 1727096421.58954: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 24971 1727096421.59078: in run() - task 0afff68d-5257-3482-6844-000000000017 24971 1727096421.59097: variable 'ansible_search_path' from source: unknown 24971 1727096421.59105: variable 'ansible_search_path' from source: unknown 24971 1727096421.59151: calling self._execute() 24971 1727096421.59234: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096421.59246: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096421.59259: variable 'omit' from source: magic vars 24971 1727096421.59609: variable 'ansible_distribution_major_version' from source: facts 24971 1727096421.59627: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096421.59637: _execute() done 24971 1727096421.59645: dumping result to json 24971 1727096421.59653: done dumping result, returning 24971 1727096421.59678: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0afff68d-5257-3482-6844-000000000017] 24971 1727096421.59777: sending task result for task 0afff68d-5257-3482-6844-000000000017 24971 1727096421.59842: done sending task result for task 0afff68d-5257-3482-6844-000000000017 24971 1727096421.59846: WORKER PROCESS EXITING 24971 1727096421.59914: no more pending results, returning what we have 24971 1727096421.59921: in VariableManager get_vars() 24971 1727096421.59970: Calling all_inventory to load vars for managed_node3 24971 1727096421.59973: Calling groups_inventory to load vars for managed_node3 24971 1727096421.59976: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096421.59988: Calling all_plugins_play to load vars for managed_node3 24971 1727096421.59991: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096421.59994: Calling groups_plugins_play to load vars for managed_node3 24971 1727096421.60283: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096421.60477: done with get_vars() 24971 1727096421.60486: variable 'ansible_search_path' from source: unknown 24971 1727096421.60487: variable 'ansible_search_path' from source: unknown 24971 1727096421.60529: we have included files to process 24971 1727096421.60530: generating all_blocks data 24971 1727096421.60532: done generating all_blocks data 24971 1727096421.60539: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 24971 1727096421.60540: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 24971 1727096421.60542: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 24971 1727096421.61261: done processing included file 24971 1727096421.61273: iterating over new_blocks loaded from include file 24971 1727096421.61275: in VariableManager get_vars() 24971 1727096421.61300: done with get_vars() 24971 1727096421.61302: filtering new block on tags 24971 1727096421.61321: done filtering new block on tags 24971 1727096421.61324: in VariableManager get_vars() 24971 1727096421.61346: done with get_vars() 24971 1727096421.61348: filtering new block on tags 24971 1727096421.61376: done filtering new block on tags 24971 1727096421.61378: in 
VariableManager get_vars() 24971 1727096421.61393: done with get_vars() 24971 1727096421.61394: filtering new block on tags 24971 1727096421.61404: done filtering new block on tags 24971 1727096421.61405: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node3 24971 1727096421.61409: extending task lists for all hosts with included blocks 24971 1727096421.61886: done extending task lists 24971 1727096421.61887: done processing included files 24971 1727096421.61887: results queue empty 24971 1727096421.61888: checking for any_errors_fatal 24971 1727096421.61889: done checking for any_errors_fatal 24971 1727096421.61890: checking for max_fail_percentage 24971 1727096421.61891: done checking for max_fail_percentage 24971 1727096421.61891: checking to see if all hosts have failed and the running result is not ok 24971 1727096421.61891: done checking to see if all hosts have failed 24971 1727096421.61892: getting the remaining hosts for this loop 24971 1727096421.61893: done getting the remaining hosts for this loop 24971 1727096421.61894: getting the next task for host managed_node3 24971 1727096421.61897: done getting next task for host managed_node3 24971 1727096421.61899: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 24971 1727096421.61901: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24971 1727096421.61907: getting variables 24971 1727096421.61907: in VariableManager get_vars() 24971 1727096421.61919: Calling all_inventory to load vars for managed_node3 24971 1727096421.61921: Calling groups_inventory to load vars for managed_node3 24971 1727096421.61922: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096421.61926: Calling all_plugins_play to load vars for managed_node3 24971 1727096421.61927: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096421.61929: Calling groups_plugins_play to load vars for managed_node3 24971 1727096421.62165: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096421.62281: done with get_vars() 24971 1727096421.62287: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Monday 23 September 2024 09:00:21 -0400 (0:00:00.038) 0:00:09.101 ****** 24971 1727096421.62332: entering _queue_task() for managed_node3/setup 24971 1727096421.62540: worker is 1 (out of 1 available) 24971 1727096421.62554: exiting _queue_task() for managed_node3/setup 24971 1727096421.62565: done queuing things up, now waiting for results queue to drain 24971 1727096421.62566: waiting for pending results... 24971 1727096421.62727: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 24971 1727096421.62814: in run() - task 0afff68d-5257-3482-6844-0000000001fc 24971 1727096421.62824: variable 'ansible_search_path' from source: unknown 24971 1727096421.62827: variable 'ansible_search_path' from source: unknown 24971 1727096421.62854: calling self._execute() 24971 1727096421.62914: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096421.62920: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096421.62928: variable 'omit' from source: magic vars 24971 1727096421.63247: variable 'ansible_distribution_major_version' from source: facts 24971 1727096421.63375: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096421.63575: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24971 1727096421.65133: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24971 1727096421.65185: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24971 1727096421.65216: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24971 1727096421.65241: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24971 1727096421.65260: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24971 1727096421.65321: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24971 1727096421.65342: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 24971 1727096421.65359: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24971 1727096421.65389: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24971 1727096421.65399: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24971 1727096421.65438: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24971 1727096421.65455: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24971 1727096421.65476: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24971 1727096421.65500: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24971 1727096421.65510: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24971 1727096421.65618: variable '__network_required_facts' from source: role '' defaults 24971 1727096421.65625: variable 'ansible_facts' from source: unknown 24971 1727096421.65689: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 24971 1727096421.65693: when evaluation is False, skipping this task 24971 1727096421.65696: _execute() done 24971 1727096421.65698: dumping result to json 24971 1727096421.65700: done dumping result, returning 24971 1727096421.65706: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0afff68d-5257-3482-6844-0000000001fc] 24971 1727096421.65711: sending task result for task 0afff68d-5257-3482-6844-0000000001fc skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 24971 1727096421.65835: no more pending results, returning what we have 24971 1727096421.65839: results queue empty 24971 1727096421.65839: checking for any_errors_fatal 24971 1727096421.65841: done checking for any_errors_fatal 24971 1727096421.65841: checking for max_fail_percentage 24971 1727096421.65843: done checking for max_fail_percentage 24971 1727096421.65844: checking to see if all hosts have failed and the running result is not ok 24971 1727096421.65846: done checking to see if all hosts have failed 24971 1727096421.65846: getting the remaining hosts for this loop 24971 1727096421.65848: done getting the remaining hosts for this loop 24971 1727096421.65851: getting the next task for host managed_node3 24971 1727096421.65860: done getting next task for host 
managed_node3 24971 1727096421.65862: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 24971 1727096421.65866: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24971 1727096421.65882: getting variables 24971 1727096421.65883: in VariableManager get_vars() 24971 1727096421.65922: Calling all_inventory to load vars for managed_node3 24971 1727096421.65924: Calling groups_inventory to load vars for managed_node3 24971 1727096421.65927: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096421.65935: Calling all_plugins_play to load vars for managed_node3 24971 1727096421.65937: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096421.65940: Calling groups_plugins_play to load vars for managed_node3 24971 1727096421.66080: done sending task result for task 0afff68d-5257-3482-6844-0000000001fc 24971 1727096421.66084: WORKER PROCESS EXITING 24971 1727096421.66104: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096421.66285: done with get_vars() 24971 1727096421.66295: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Monday 23 September 2024 09:00:21 -0400 (0:00:00.040) 0:00:09.141 ****** 24971 1727096421.66394: entering _queue_task() for managed_node3/stat 24971 1727096421.66636: worker is 1 (out of 1 available) 24971 1727096421.66649: exiting _queue_task() for managed_node3/stat 24971 1727096421.66664: done queuing things up, now waiting for results queue to drain 24971 1727096421.66665: waiting for pending results... 
24971 1727096421.67085: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 24971 1727096421.67090: in run() - task 0afff68d-5257-3482-6844-0000000001fe 24971 1727096421.67099: variable 'ansible_search_path' from source: unknown 24971 1727096421.67102: variable 'ansible_search_path' from source: unknown 24971 1727096421.67138: calling self._execute() 24971 1727096421.67223: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096421.67227: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096421.67236: variable 'omit' from source: magic vars 24971 1727096421.67629: variable 'ansible_distribution_major_version' from source: facts 24971 1727096421.67643: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096421.67773: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24971 1727096421.68025: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24971 1727096421.68057: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24971 1727096421.68100: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24971 1727096421.68124: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24971 1727096421.68189: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 24971 1727096421.68206: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 24971 1727096421.68224: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 24971 1727096421.68241: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 24971 1727096421.68306: variable '__network_is_ostree' from source: set_fact 24971 1727096421.68312: Evaluated conditional (not __network_is_ostree is defined): False 24971 1727096421.68315: when evaluation is False, skipping this task 24971 1727096421.68318: _execute() done 24971 1727096421.68320: dumping result to json 24971 1727096421.68322: done dumping result, returning 24971 1727096421.68329: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [0afff68d-5257-3482-6844-0000000001fe] 24971 1727096421.68334: sending task result for task 0afff68d-5257-3482-6844-0000000001fe 24971 1727096421.68413: done sending task result for task 0afff68d-5257-3482-6844-0000000001fe 24971 1727096421.68416: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 24971 1727096421.68464: no more pending results, returning what we have 24971 1727096421.68471: results queue empty 24971 1727096421.68472: checking for any_errors_fatal 24971 1727096421.68478: done checking for any_errors_fatal 24971 1727096421.68478: checking for 
max_fail_percentage 24971 1727096421.68480: done checking for max_fail_percentage 24971 1727096421.68481: checking to see if all hosts have failed and the running result is not ok 24971 1727096421.68482: done checking to see if all hosts have failed 24971 1727096421.68483: getting the remaining hosts for this loop 24971 1727096421.68484: done getting the remaining hosts for this loop 24971 1727096421.68487: getting the next task for host managed_node3 24971 1727096421.68493: done getting next task for host managed_node3 24971 1727096421.68496: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 24971 1727096421.68499: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24971 1727096421.68512: getting variables 24971 1727096421.68513: in VariableManager get_vars() 24971 1727096421.68549: Calling all_inventory to load vars for managed_node3 24971 1727096421.68551: Calling groups_inventory to load vars for managed_node3 24971 1727096421.68553: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096421.68561: Calling all_plugins_play to load vars for managed_node3 24971 1727096421.68563: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096421.68565: Calling groups_plugins_play to load vars for managed_node3 24971 1727096421.68728: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096421.68848: done with get_vars() 24971 1727096421.68855: done getting variables 24971 1727096421.68896: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Monday 23 September 2024 09:00:21 -0400 (0:00:00.025) 0:00:09.167 ****** 24971 1727096421.68922: entering _queue_task() for managed_node3/set_fact 24971 1727096421.69112: worker is 1 (out of 1 available) 24971 1727096421.69124: exiting _queue_task() for managed_node3/set_fact 24971 1727096421.69136: done queuing things up, now waiting for results queue to drain 24971 1727096421.69137: waiting for pending results... 
24971 1727096421.69318: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 24971 1727096421.69463: in run() - task 0afff68d-5257-3482-6844-0000000001ff 24971 1727096421.69470: variable 'ansible_search_path' from source: unknown 24971 1727096421.69475: variable 'ansible_search_path' from source: unknown 24971 1727096421.69582: calling self._execute() 24971 1727096421.69586: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096421.69589: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096421.69594: variable 'omit' from source: magic vars 24971 1727096421.69928: variable 'ansible_distribution_major_version' from source: facts 24971 1727096421.69975: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096421.70177: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24971 1727096421.70394: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24971 1727096421.70411: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24971 1727096421.70441: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24971 1727096421.70575: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24971 1727096421.70579: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 24971 1727096421.70581: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 24971 1727096421.70595: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 24971 1727096421.70624: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 24971 1727096421.70747: variable '__network_is_ostree' from source: set_fact 24971 1727096421.70761: Evaluated conditional (not __network_is_ostree is defined): False 24971 1727096421.70783: when evaluation is False, skipping this task 24971 1727096421.70794: _execute() done 24971 1727096421.70887: dumping result to json 24971 1727096421.70891: done dumping result, returning 24971 1727096421.70895: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0afff68d-5257-3482-6844-0000000001ff] 24971 1727096421.70901: sending task result for task 0afff68d-5257-3482-6844-0000000001ff 24971 1727096421.71072: done sending task result for task 0afff68d-5257-3482-6844-0000000001ff skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 24971 1727096421.71122: no more pending results, returning what we have 24971 1727096421.71127: results queue empty 24971 1727096421.71128: checking for any_errors_fatal 24971 1727096421.71134: done checking for any_errors_fatal 24971 1727096421.71135: checking for max_fail_percentage 24971 
1727096421.71136: done checking for max_fail_percentage 24971 1727096421.71137: checking to see if all hosts have failed and the running result is not ok 24971 1727096421.71138: done checking to see if all hosts have failed 24971 1727096421.71139: getting the remaining hosts for this loop 24971 1727096421.71140: done getting the remaining hosts for this loop 24971 1727096421.71144: getting the next task for host managed_node3 24971 1727096421.71153: done getting next task for host managed_node3 24971 1727096421.71156: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 24971 1727096421.71160: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24971 1727096421.71177: getting variables 24971 1727096421.71179: in VariableManager get_vars() 24971 1727096421.71219: Calling all_inventory to load vars for managed_node3 24971 1727096421.71221: Calling groups_inventory to load vars for managed_node3 24971 1727096421.71224: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096421.71235: Calling all_plugins_play to load vars for managed_node3 24971 1727096421.71238: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096421.71240: Calling groups_plugins_play to load vars for managed_node3 24971 1727096421.71777: WORKER PROCESS EXITING 24971 1727096421.71798: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096421.72028: done with get_vars() 24971 1727096421.72039: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Monday 23 September 2024 09:00:21 -0400 (0:00:00.032) 0:00:09.199 ****** 24971 1727096421.72156: entering _queue_task() for managed_node3/service_facts 24971 1727096421.72158: Creating lock for service_facts 24971 1727096421.72398: worker is 1 (out of 1 available) 24971 1727096421.72410: exiting _queue_task() for managed_node3/service_facts 24971 1727096421.72421: done queuing things up, now waiting for results queue to drain 24971 1727096421.72422: waiting for pending results... 
24971 1727096421.72582: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running 24971 1727096421.72665: in run() - task 0afff68d-5257-3482-6844-000000000201 24971 1727096421.72680: variable 'ansible_search_path' from source: unknown 24971 1727096421.72684: variable 'ansible_search_path' from source: unknown 24971 1727096421.72710: calling self._execute() 24971 1727096421.72835: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096421.72840: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096421.72843: variable 'omit' from source: magic vars 24971 1727096421.73042: variable 'ansible_distribution_major_version' from source: facts 24971 1727096421.73054: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096421.73059: variable 'omit' from source: magic vars 24971 1727096421.73110: variable 'omit' from source: magic vars 24971 1727096421.73132: variable 'omit' from source: magic vars 24971 1727096421.73164: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24971 1727096421.73196: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24971 1727096421.73211: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24971 1727096421.73283: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096421.73292: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096421.73317: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24971 1727096421.73321: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096421.73323: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096421.73416: Set connection var ansible_shell_type to sh 24971 1727096421.73424: Set connection var ansible_shell_executable to /bin/sh 24971 1727096421.73453: Set connection var ansible_timeout to 10 24971 1727096421.73455: Set connection var ansible_connection to ssh 24971 1727096421.73474: Set connection var ansible_pipelining to False 24971 1727096421.73476: Set connection var ansible_module_compression to ZIP_DEFLATED 24971 1727096421.73675: variable 'ansible_shell_executable' from source: unknown 24971 1727096421.73678: variable 'ansible_connection' from source: unknown 24971 1727096421.73680: variable 'ansible_module_compression' from source: unknown 24971 1727096421.73682: variable 'ansible_shell_type' from source: unknown 24971 1727096421.73684: variable 'ansible_shell_executable' from source: unknown 24971 1727096421.73686: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096421.73688: variable 'ansible_pipelining' from source: unknown 24971 1727096421.73689: variable 'ansible_timeout' from source: unknown 24971 1727096421.73691: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096421.74024: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 24971 1727096421.74275: variable 'omit' from source: magic vars 24971 
1727096421.74279: starting attempt loop 24971 1727096421.74282: running the handler 24971 1727096421.74284: _low_level_execute_command(): starting 24971 1727096421.74287: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24971 1727096421.75300: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096421.75331: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096421.75489: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096421.75510: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096421.75587: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096421.77258: stdout chunk (state=3): >>>/root <<< 24971 1727096421.77409: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096421.77427: stdout chunk (state=3): >>><<< 24971 1727096421.77449: stderr chunk (state=3): >>><<< 24971 1727096421.77498: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096421.77520: _low_level_execute_command(): starting 24971 1727096421.77533: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096421.7750561-25418-6032202547349 `" && echo ansible-tmp-1727096421.7750561-25418-6032202547349="` echo 
/root/.ansible/tmp/ansible-tmp-1727096421.7750561-25418-6032202547349 `" ) && sleep 0' 24971 1727096421.78159: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24971 1727096421.78201: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096421.78215: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096421.78218: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24971 1727096421.78221: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 24971 1727096421.78223: stderr chunk (state=3): >>>debug2: match not found <<< 24971 1727096421.78234: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096421.78237: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24971 1727096421.78323: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address <<< 24971 1727096421.78326: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24971 1727096421.78329: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096421.78330: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096421.78333: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24971 1727096421.78334: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 24971 1727096421.78336: stderr chunk (state=3): >>>debug2: match found <<< 24971 1727096421.78338: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096421.78377: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096421.78390: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096421.78408: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096421.78490: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096421.80498: stdout chunk (state=3): >>>ansible-tmp-1727096421.7750561-25418-6032202547349=/root/.ansible/tmp/ansible-tmp-1727096421.7750561-25418-6032202547349 <<< 24971 1727096421.80585: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096421.80651: stderr chunk (state=3): >>><<< 24971 1727096421.80675: stdout chunk (state=3): >>><<< 24971 1727096421.80698: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096421.7750561-25418-6032202547349=/root/.ansible/tmp/ansible-tmp-1727096421.7750561-25418-6032202547349 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096421.80875: variable 'ansible_module_compression' from source: unknown 24971 1727096421.80878: ANSIBALLZ: Using lock for service_facts 24971 1727096421.80880: ANSIBALLZ: Acquiring lock 24971 1727096421.80882: ANSIBALLZ: Lock acquired: 139839575494288 24971 1727096421.80883: ANSIBALLZ: Creating module 24971 1727096421.95708: ANSIBALLZ: Writing module into payload 24971 1727096421.95819: ANSIBALLZ: Writing module 24971 1727096421.95858: ANSIBALLZ: Renaming module 24971 1727096421.95875: ANSIBALLZ: Done creating module 24971 1727096421.95895: variable 'ansible_facts' from source: unknown 24971 1727096421.95991: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096421.7750561-25418-6032202547349/AnsiballZ_service_facts.py 24971 1727096421.96209: Sending initial data 24971 1727096421.96213: Sent initial data (160 bytes) 24971 1727096421.96791: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24971 1727096421.96800: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096421.96811: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096421.96882: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096421.96918: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096421.96930: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096421.96950: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096421.97019: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096421.98717: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 24971 1727096421.98726: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" 
revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24971 1727096421.98730: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 24971 1727096421.98759: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24971ctsu8jan/tmpozkn2uep /root/.ansible/tmp/ansible-tmp-1727096421.7750561-25418-6032202547349/AnsiballZ_service_facts.py <<< 24971 1727096421.98762: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096421.7750561-25418-6032202547349/AnsiballZ_service_facts.py" <<< 24971 1727096421.98831: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24971ctsu8jan/tmpozkn2uep" to remote "/root/.ansible/tmp/ansible-tmp-1727096421.7750561-25418-6032202547349/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096421.7750561-25418-6032202547349/AnsiballZ_service_facts.py" <<< 24971 1727096421.99691: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096421.99878: stderr chunk (state=3): >>><<< 24971 1727096421.99882: stdout chunk (state=3): >>><<< 24971 1727096421.99884: done transferring module to remote 24971 1727096421.99886: _low_level_execute_command(): starting 24971 1727096421.99888: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096421.7750561-25418-6032202547349/ /root/.ansible/tmp/ansible-tmp-1727096421.7750561-25418-6032202547349/AnsiballZ_service_facts.py && sleep 0' 24971 1727096422.00575: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096422.00581: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096422.00643: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096422.02484: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096422.02488: stdout chunk (state=3): >>><<< 24971 1727096422.02495: stderr chunk (state=3): >>><<< 24971 1727096422.02511: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096422.02515: _low_level_execute_command(): starting 24971 1727096422.02520: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096421.7750561-25418-6032202547349/AnsiballZ_service_facts.py && sleep 0' 24971 1727096422.03341: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24971 1727096422.03482: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096422.03506: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096422.03573: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096423.56760: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": 
{"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": 
"logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": 
{"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-ma<<< 24971 1727096423.56831: stdout chunk (state=3): >>>rk.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": 
"systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": 
"dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": 
"qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", 
"status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": 
"systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 24971 1727096423.58325: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096423.58393: stderr chunk (state=3): >>>Shared connection to 10.31.14.152 closed. <<< 24971 1727096423.58407: stderr chunk (state=3): >>><<< 24971 1727096423.58417: stdout chunk (state=3): >>><<< 24971 1727096423.58448: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": 
"systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": 
"user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": 
"systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": 
"systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
24971 1727096423.59192: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096421.7750561-25418-6032202547349/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24971 1727096423.59196: _low_level_execute_command(): starting 24971 1727096423.59199: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096421.7750561-25418-6032202547349/ > /dev/null 2>&1 && sleep 0' 24971 1727096423.59750: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24971 1727096423.59765: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096423.59786: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096423.59806: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24971 1727096423.59824: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 24971 1727096423.59918: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096423.60193: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096423.60263: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096423.62153: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096423.62164: stdout chunk (state=3): >>><<< 24971 1727096423.62183: stderr chunk (state=3): >>><<< 24971 1727096423.62208: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096423.62221: handler run complete 24971 1727096423.62437: variable 'ansible_facts' from source: unknown 24971 1727096423.62615: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096423.63117: variable 'ansible_facts' from source: unknown 24971 1727096423.63277: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096423.63494: attempt loop complete, returning result 24971 1727096423.63504: _execute() done 24971 1727096423.63511: dumping result to json 24971 1727096423.63582: done dumping result, returning 24971 1727096423.63597: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running [0afff68d-5257-3482-6844-000000000201] 24971 1727096423.63607: sending task result for task 0afff68d-5257-3482-6844-000000000201 24971 1727096423.64718: done sending task result for task 0afff68d-5257-3482-6844-000000000201 24971 1727096423.64721: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 24971 1727096423.64832: no more pending results, returning what we have 24971 1727096423.64835: results queue empty 24971 1727096423.64835: checking for any_errors_fatal 24971 1727096423.64838: done checking for any_errors_fatal 24971 1727096423.64838: checking for max_fail_percentage 24971 1727096423.64839: done checking for max_fail_percentage 24971 1727096423.64840: checking to see if all hosts have failed and the running result is not ok 24971 1727096423.64840: done checking to see if all hosts have failed 24971 1727096423.64841: getting the remaining hosts for this loop 24971 1727096423.64842: done getting the remaining hosts for this loop 24971 1727096423.64844: getting the next task for host managed_node3 24971 1727096423.64848: done getting next task for host managed_node3 24971 1727096423.64850: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 24971 1727096423.64852: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24971 1727096423.64858: getting variables 24971 1727096423.64859: in VariableManager get_vars() 24971 1727096423.64891: Calling all_inventory to load vars for managed_node3 24971 1727096423.64893: Calling groups_inventory to load vars for managed_node3 24971 1727096423.64894: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096423.64901: Calling all_plugins_play to load vars for managed_node3 24971 1727096423.64902: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096423.64904: Calling groups_plugins_play to load vars for managed_node3 24971 1727096423.65143: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096423.65426: done with get_vars() 24971 1727096423.65436: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Monday 23 September 2024 09:00:23 -0400 (0:00:01.933) 0:00:11.132 ****** 24971 1727096423.65509: entering _queue_task() for managed_node3/package_facts 24971 1727096423.65511: Creating lock for package_facts 24971 1727096423.65725: worker is 1 (out of 1 available) 24971 1727096423.65745: exiting _queue_task() for managed_node3/package_facts 24971 1727096423.65755: done queuing things up, now waiting for results queue to drain 24971 1727096423.65756: waiting for pending results... 24971 1727096423.65913: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 24971 1727096423.66004: in run() - task 0afff68d-5257-3482-6844-000000000202 24971 1727096423.66015: variable 'ansible_search_path' from source: unknown 24971 1727096423.66019: variable 'ansible_search_path' from source: unknown 24971 1727096423.66046: calling self._execute() 24971 1727096423.66109: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096423.66113: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096423.66123: variable 'omit' from source: magic vars 24971 1727096423.66378: variable 'ansible_distribution_major_version' from source: facts 24971 1727096423.66390: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096423.66393: variable 'omit' from source: magic vars 24971 1727096423.66442: variable 'omit' from source: magic vars 24971 1727096423.66464: variable 'omit' from source: magic vars 24971 1727096423.66496: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24971 1727096423.66555: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24971 1727096423.66559: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24971 1727096423.66564: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096423.66576: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096423.66629: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24971 1727096423.66632: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096423.66634: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096423.66882: Set 
connection var ansible_shell_type to sh 24971 1727096423.66885: Set connection var ansible_shell_executable to /bin/sh 24971 1727096423.66888: Set connection var ansible_timeout to 10 24971 1727096423.66890: Set connection var ansible_connection to ssh 24971 1727096423.66892: Set connection var ansible_pipelining to False 24971 1727096423.66894: Set connection var ansible_module_compression to ZIP_DEFLATED 24971 1727096423.66896: variable 'ansible_shell_executable' from source: unknown 24971 1727096423.66898: variable 'ansible_connection' from source: unknown 24971 1727096423.66901: variable 'ansible_module_compression' from source: unknown 24971 1727096423.66903: variable 'ansible_shell_type' from source: unknown 24971 1727096423.66905: variable 'ansible_shell_executable' from source: unknown 24971 1727096423.66907: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096423.66909: variable 'ansible_pipelining' from source: unknown 24971 1727096423.66911: variable 'ansible_timeout' from source: unknown 24971 1727096423.66913: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096423.67076: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 24971 1727096423.67102: variable 'omit' from source: magic vars 24971 1727096423.67110: starting attempt loop 24971 1727096423.67116: running the handler 24971 1727096423.67131: _low_level_execute_command(): starting 24971 1727096423.67142: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24971 1727096423.67790: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24971 1727096423.67805: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096423.67848: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096423.67872: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096423.67902: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096423.69527: stdout chunk (state=3): >>>/root <<< 24971 1727096423.69629: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096423.69703: stderr chunk (state=3): >>><<< 24971 1727096423.69710: stdout chunk (state=3): >>><<< 24971 1727096423.69730: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096423.69820: _low_level_execute_command(): starting 24971 1727096423.69824: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096423.6973746-25501-76491953863662 `" && echo ansible-tmp-1727096423.6973746-25501-76491953863662="` echo /root/.ansible/tmp/ansible-tmp-1727096423.6973746-25501-76491953863662 `" ) && sleep 0' 24971 1727096423.70340: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24971 1727096423.70356: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096423.70375: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096423.70395: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24971 1727096423.70426: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096423.70488: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096423.70542: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096423.70569: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096423.70586: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096423.70656: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096423.72545: stdout chunk (state=3): >>>ansible-tmp-1727096423.6973746-25501-76491953863662=/root/.ansible/tmp/ansible-tmp-1727096423.6973746-25501-76491953863662 <<< 24971 1727096423.72687: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096423.72704: stderr chunk 
(state=3): >>><<< 24971 1727096423.72707: stdout chunk (state=3): >>><<< 24971 1727096423.72736: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096423.6973746-25501-76491953863662=/root/.ansible/tmp/ansible-tmp-1727096423.6973746-25501-76491953863662 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096423.72779: variable 'ansible_module_compression' from source: unknown 24971 1727096423.72876: ANSIBALLZ: Using lock for package_facts 24971 1727096423.72879: ANSIBALLZ: Acquiring lock 24971 1727096423.72881: ANSIBALLZ: Lock acquired: 139839575225072 24971 1727096423.72883: ANSIBALLZ: Creating module 24971 1727096423.91753: ANSIBALLZ: Writing module into payload 24971 1727096423.91845: ANSIBALLZ: Writing module 24971 1727096423.91866: ANSIBALLZ: Renaming module 24971 1727096423.91876: ANSIBALLZ: Done creating module 24971 1727096423.91902: variable 'ansible_facts' from source: unknown 24971 1727096423.92020: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096423.6973746-25501-76491953863662/AnsiballZ_package_facts.py 24971 1727096423.92125: Sending initial data 24971 1727096423.92129: Sent initial data (161 bytes) 24971 1727096423.92561: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096423.92598: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24971 1727096423.92601: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 24971 1727096423.92604: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096423.92607: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096423.92658: stderr chunk (state=3): >>>debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096423.92661: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096423.92663: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096423.92710: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096423.94312: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 24971 1727096423.94315: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24971 1727096423.94343: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 24971 1727096423.94378: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24971ctsu8jan/tmpmqu20gre /root/.ansible/tmp/ansible-tmp-1727096423.6973746-25501-76491953863662/AnsiballZ_package_facts.py <<< 24971 1727096423.94388: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096423.6973746-25501-76491953863662/AnsiballZ_package_facts.py" <<< 24971 1727096423.94408: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24971ctsu8jan/tmpmqu20gre" to remote "/root/.ansible/tmp/ansible-tmp-1727096423.6973746-25501-76491953863662/AnsiballZ_package_facts.py" <<< 24971 1727096423.94412: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096423.6973746-25501-76491953863662/AnsiballZ_package_facts.py" <<< 24971 1727096423.95378: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096423.95425: stderr chunk (state=3): >>><<< 24971 1727096423.95428: stdout chunk (state=3): >>><<< 24971 1727096423.95449: done transferring module to remote 24971 1727096423.95457: _low_level_execute_command(): starting 24971 1727096423.95462: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096423.6973746-25501-76491953863662/ /root/.ansible/tmp/ansible-tmp-1727096423.6973746-25501-76491953863662/AnsiballZ_package_facts.py && sleep 0' 24971 1727096423.95904: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096423.95907: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24971 1727096423.95909: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096423.95915: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096423.95917: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 24971 1727096423.95919: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096423.95969: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096423.95974: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096423.95978: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096423.96008: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096423.97743: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096423.97771: stderr chunk (state=3): >>><<< 24971 1727096423.97776: stdout chunk (state=3): >>><<< 24971 1727096423.97790: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096423.97793: _low_level_execute_command(): starting 24971 1727096423.97797: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096423.6973746-25501-76491953863662/AnsiballZ_package_facts.py && sleep 0' 24971 1727096423.98214: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096423.98217: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 24971 1727096423.98220: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096423.98222: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096423.98265: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096423.98276: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096423.98312: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096424.42307: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "rele<<< 24971 1727096424.42327: stdout chunk (state=3): >>>ase": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": 
"2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", 
"version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null,<<< 24971 1727096424.42346: stdout chunk (state=3): >>> "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": 
"libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": 
"7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10",<<< 24971 1727096424.42377: stdout chunk (state=3): >>> "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", 
"version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arc<<< 24971 1727096424.42411: stdout chunk (state=3): >>>h": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": 
"40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"<<< 24971 1727096424.42417: stdout chunk (state=3): >>>}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-resc<<< 24971 1727096424.42421: stdout chunk (state=3): >>>ue", "version": "102", "release": "3.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", 
"version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "r<<< 24971 1727096424.42453: stdout chunk (state=3): >>>pm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": 
"510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1<<< 24971 1727096424.42460: stdout chunk (state=3): >>>.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": 
"perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10<<< 24971 1727096424.42475: stdout chunk (state=3): >>>", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", 
"release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.<<< 24971 1727096424.42486: stdout chunk (state=3): >>>26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 24971 1727096424.44240: stderr chunk (state=3): >>>debug2: 
Received exit status from master 0 Shared connection to 10.31.14.152 closed. <<< 24971 1727096424.44274: stderr chunk (state=3): >>><<< 24971 1727096424.44277: stdout chunk (state=3): >>><<< 24971 1727096424.44317: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", 
"version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": 
"3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": 
"libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": 
"14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": 
"python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": 
"2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": 
[{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, 
"arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": 
"x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 24971 1727096424.46033: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096423.6973746-25501-76491953863662/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24971 1727096424.46051: _low_level_execute_command(): starting 24971 1727096424.46054: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096423.6973746-25501-76491953863662/ > /dev/null 2>&1 && sleep 0' 24971 1727096424.46516: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096424.46519: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 24971 1727096424.46521: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096424.46524: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 24971 1727096424.46526: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096424.46584: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096424.46591: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096424.46593: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096424.46625: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096424.48426: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096424.48453: stderr chunk (state=3): >>><<< 24971 1727096424.48456: stdout chunk (state=3): >>><<< 24971 1727096424.48477: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096424.48484: handler run complete 24971 1727096424.48929: variable 'ansible_facts' from source: unknown 24971 1727096424.49214: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096424.50238: variable 'ansible_facts' from source: unknown 24971 1727096424.50469: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096424.50846: attempt loop complete, returning result 24971 1727096424.50858: _execute() done 24971 1727096424.50861: dumping result to json 24971 1727096424.50978: done dumping result, returning 24971 1727096424.50986: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [0afff68d-5257-3482-6844-000000000202] 24971 1727096424.50991: sending task result for task 0afff68d-5257-3482-6844-000000000202 24971 1727096424.52309: done sending task result for task 0afff68d-5257-3482-6844-000000000202 24971 1727096424.52312: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 24971 1727096424.52350: no more pending results, returning what we have 24971 1727096424.52352: results queue empty 24971 1727096424.52353: checking for any_errors_fatal 24971 1727096424.52356: done checking for any_errors_fatal 24971 1727096424.52356: checking for max_fail_percentage 24971 1727096424.52357: done checking for max_fail_percentage 24971 1727096424.52358: checking to see if all hosts have failed and the running result is not ok 24971 1727096424.52358: done checking to see if all hosts have failed 24971 1727096424.52359: getting the remaining hosts for this loop 24971 1727096424.52359: done getting the remaining hosts for this loop 24971 1727096424.52362: getting the next task for host managed_node3 24971 1727096424.52366: done getting next task for host managed_node3 24971 1727096424.52372: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 24971 1727096424.52374: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24971 1727096424.52380: getting variables 24971 1727096424.52381: in VariableManager get_vars() 24971 1727096424.52403: Calling all_inventory to load vars for managed_node3 24971 1727096424.52404: Calling groups_inventory to load vars for managed_node3 24971 1727096424.52408: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096424.52416: Calling all_plugins_play to load vars for managed_node3 24971 1727096424.52421: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096424.52424: Calling groups_plugins_play to load vars for managed_node3 24971 1727096424.53125: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096424.53981: done with get_vars() 24971 1727096424.53996: done getting variables 24971 1727096424.54041: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Monday 23 September 2024 09:00:24 -0400 (0:00:00.885) 0:00:12.018 ****** 24971 1727096424.54071: entering _queue_task() for managed_node3/debug 24971 1727096424.54287: worker is 1 (out of 1 available) 24971 1727096424.54301: exiting _queue_task() for managed_node3/debug 24971 1727096424.54312: done queuing things up, now waiting for results queue to drain 24971 1727096424.54313: waiting for pending results... 24971 1727096424.54483: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 24971 1727096424.54558: in run() - task 0afff68d-5257-3482-6844-000000000018 24971 1727096424.54573: variable 'ansible_search_path' from source: unknown 24971 1727096424.54577: variable 'ansible_search_path' from source: unknown 24971 1727096424.54603: calling self._execute() 24971 1727096424.54679: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096424.54683: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096424.54692: variable 'omit' from source: magic vars 24971 1727096424.54948: variable 'ansible_distribution_major_version' from source: facts 24971 1727096424.54958: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096424.54964: variable 'omit' from source: magic vars 24971 1727096424.55005: variable 'omit' from source: magic vars 24971 1727096424.55073: variable 'network_provider' from source: set_fact 24971 1727096424.55088: variable 'omit' from source: magic vars 24971 1727096424.55120: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24971 1727096424.55145: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24971 1727096424.55160: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24971 1727096424.55176: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096424.55186: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 
1727096424.55212: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24971 1727096424.55215: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096424.55217: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096424.55281: Set connection var ansible_shell_type to sh 24971 1727096424.55288: Set connection var ansible_shell_executable to /bin/sh 24971 1727096424.55298: Set connection var ansible_timeout to 10 24971 1727096424.55301: Set connection var ansible_connection to ssh 24971 1727096424.55313: Set connection var ansible_pipelining to False 24971 1727096424.55316: Set connection var ansible_module_compression to ZIP_DEFLATED 24971 1727096424.55328: variable 'ansible_shell_executable' from source: unknown 24971 1727096424.55332: variable 'ansible_connection' from source: unknown 24971 1727096424.55335: variable 'ansible_module_compression' from source: unknown 24971 1727096424.55337: variable 'ansible_shell_type' from source: unknown 24971 1727096424.55340: variable 'ansible_shell_executable' from source: unknown 24971 1727096424.55342: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096424.55344: variable 'ansible_pipelining' from source: unknown 24971 1727096424.55346: variable 'ansible_timeout' from source: unknown 24971 1727096424.55349: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096424.55447: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24971 1727096424.55455: variable 'omit' from source: magic vars 24971 1727096424.55458: starting attempt loop 24971 1727096424.55461: running the handler 24971 1727096424.55516: handler run complete 24971 1727096424.55528: attempt loop complete, returning result 24971 1727096424.55533: _execute() done 24971 1727096424.55536: dumping result to json 24971 1727096424.55539: done dumping result, returning 24971 1727096424.55546: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [0afff68d-5257-3482-6844-000000000018] 24971 1727096424.55549: sending task result for task 0afff68d-5257-3482-6844-000000000018 24971 1727096424.55625: done sending task result for task 0afff68d-5257-3482-6844-000000000018 24971 1727096424.55628: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: Using network provider: nm 24971 1727096424.55686: no more pending results, returning what we have 24971 1727096424.55689: results queue empty 24971 1727096424.55690: checking for any_errors_fatal 24971 1727096424.55700: done checking for any_errors_fatal 24971 1727096424.55700: checking for max_fail_percentage 24971 1727096424.55702: done checking for max_fail_percentage 24971 1727096424.55703: checking to see if all hosts have failed and the running result is not ok 24971 1727096424.55703: done checking to see if all hosts have failed 24971 1727096424.55704: getting the remaining hosts for this loop 24971 1727096424.55705: done getting the remaining hosts for this loop 24971 1727096424.55709: getting the next task for host managed_node3 24971 1727096424.55715: done getting next task for host managed_node3 24971 1727096424.55718: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state 
configuration if using the `network_state` variable with the initscripts provider 24971 1727096424.55721: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24971 1727096424.55731: getting variables 24971 1727096424.55732: in VariableManager get_vars() 24971 1727096424.55771: Calling all_inventory to load vars for managed_node3 24971 1727096424.55773: Calling groups_inventory to load vars for managed_node3 24971 1727096424.55776: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096424.55784: Calling all_plugins_play to load vars for managed_node3 24971 1727096424.55786: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096424.55788: Calling groups_plugins_play to load vars for managed_node3 24971 1727096424.56520: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096424.57464: done with get_vars() 24971 1727096424.57484: done getting variables 24971 1727096424.57524: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Monday 23 September 2024 09:00:24 -0400 (0:00:00.034) 0:00:12.053 ****** 24971 1727096424.57546: entering _queue_task() for managed_node3/fail 24971 1727096424.57752: worker is 1 (out of 1 available) 24971 1727096424.57766: exiting _queue_task() for managed_node3/fail 24971 1727096424.57780: done queuing things up, now waiting for results queue to drain 24971 1727096424.57781: waiting for pending results... 
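The trace above runs the package_facts module with module_args {"manager": ["auto"], "strategy": "first"} and then censors the result because no_log: true was set on the task. As a rough sketch only, a task producing this kind of invocation could look like the following; the module parameters are taken from the invocation logged above, while the exact layout of the task inside the role is assumed rather than confirmed by this log:

    - name: Check which packages are installed
      ansible.builtin.package_facts:
        manager: auto      # let the module pick the backend (rpm/dnf on this host)
        strategy: first    # stop at the first package manager that works
      no_log: true         # matches the censored result shown in the trace above

The gathered package list is returned under ansible_facts.packages, which is what the trace reads back as the 'ansible_facts' variable once the handler completes.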
24971 1727096424.57940: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 24971 1727096424.58074: in run() - task 0afff68d-5257-3482-6844-000000000019 24971 1727096424.58077: variable 'ansible_search_path' from source: unknown 24971 1727096424.58079: variable 'ansible_search_path' from source: unknown 24971 1727096424.58081: calling self._execute() 24971 1727096424.58131: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096424.58135: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096424.58143: variable 'omit' from source: magic vars 24971 1727096424.58397: variable 'ansible_distribution_major_version' from source: facts 24971 1727096424.58407: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096424.58491: variable 'network_state' from source: role '' defaults 24971 1727096424.58499: Evaluated conditional (network_state != {}): False 24971 1727096424.58502: when evaluation is False, skipping this task 24971 1727096424.58505: _execute() done 24971 1727096424.58508: dumping result to json 24971 1727096424.58511: done dumping result, returning 24971 1727096424.58515: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0afff68d-5257-3482-6844-000000000019] 24971 1727096424.58520: sending task result for task 0afff68d-5257-3482-6844-000000000019 24971 1727096424.58601: done sending task result for task 0afff68d-5257-3482-6844-000000000019 24971 1727096424.58604: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 24971 1727096424.58649: no more pending results, returning what we have 24971 1727096424.58652: results queue empty 24971 1727096424.58653: checking for any_errors_fatal 24971 1727096424.58659: done checking for any_errors_fatal 24971 1727096424.58660: checking for max_fail_percentage 24971 1727096424.58661: done checking for max_fail_percentage 24971 1727096424.58662: checking to see if all hosts have failed and the running result is not ok 24971 1727096424.58663: done checking to see if all hosts have failed 24971 1727096424.58664: getting the remaining hosts for this loop 24971 1727096424.58665: done getting the remaining hosts for this loop 24971 1727096424.58670: getting the next task for host managed_node3 24971 1727096424.58676: done getting next task for host managed_node3 24971 1727096424.58679: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 24971 1727096424.58682: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24971 1727096424.58695: getting variables 24971 1727096424.58697: in VariableManager get_vars() 24971 1727096424.58736: Calling all_inventory to load vars for managed_node3 24971 1727096424.58738: Calling groups_inventory to load vars for managed_node3 24971 1727096424.58740: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096424.58748: Calling all_plugins_play to load vars for managed_node3 24971 1727096424.58751: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096424.58753: Calling groups_plugins_play to load vars for managed_node3 24971 1727096424.59476: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096424.60323: done with get_vars() 24971 1727096424.60338: done getting variables 24971 1727096424.60381: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Monday 23 September 2024 09:00:24 -0400 (0:00:00.028) 0:00:12.081 ****** 24971 1727096424.60404: entering _queue_task() for managed_node3/fail 24971 1727096424.60597: worker is 1 (out of 1 available) 24971 1727096424.60610: exiting _queue_task() for managed_node3/fail 24971 1727096424.60621: done queuing things up, now waiting for results queue to drain 24971 1727096424.60622: waiting for pending results... 
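The abort task above is skipped because its guard, network_state != {}, evaluates to False: network_state comes from the role defaults and is empty on this run, so the fail action never fires, and the same pattern repeats for the next abort task in the trace. A hedged sketch of the guard pattern this reflects, assuming a plain fail module behind it; the real task in the role may word the condition and message differently:

    - name: Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
      ansible.builtin.fail:
        msg: Applying `network_state` is not supported with the initscripts provider   # illustrative message
      when: network_state != {}   # the only condition visible in this trace

Because the condition is False here, the result is "skipping" with false_condition set to "network_state != {}", exactly as logged.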
24971 1727096424.60780: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 24971 1727096424.60862: in run() - task 0afff68d-5257-3482-6844-00000000001a 24971 1727096424.60876: variable 'ansible_search_path' from source: unknown 24971 1727096424.60880: variable 'ansible_search_path' from source: unknown 24971 1727096424.60906: calling self._execute() 24971 1727096424.60965: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096424.60974: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096424.60982: variable 'omit' from source: magic vars 24971 1727096424.61223: variable 'ansible_distribution_major_version' from source: facts 24971 1727096424.61233: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096424.61314: variable 'network_state' from source: role '' defaults 24971 1727096424.61321: Evaluated conditional (network_state != {}): False 24971 1727096424.61324: when evaluation is False, skipping this task 24971 1727096424.61327: _execute() done 24971 1727096424.61330: dumping result to json 24971 1727096424.61332: done dumping result, returning 24971 1727096424.61339: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0afff68d-5257-3482-6844-00000000001a] 24971 1727096424.61342: sending task result for task 0afff68d-5257-3482-6844-00000000001a 24971 1727096424.61423: done sending task result for task 0afff68d-5257-3482-6844-00000000001a 24971 1727096424.61425: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 24971 1727096424.61474: no more pending results, returning what we have 24971 1727096424.61477: results queue empty 24971 1727096424.61478: checking for any_errors_fatal 24971 1727096424.61486: done checking for any_errors_fatal 24971 1727096424.61486: checking for max_fail_percentage 24971 1727096424.61488: done checking for max_fail_percentage 24971 1727096424.61489: checking to see if all hosts have failed and the running result is not ok 24971 1727096424.61489: done checking to see if all hosts have failed 24971 1727096424.61490: getting the remaining hosts for this loop 24971 1727096424.61491: done getting the remaining hosts for this loop 24971 1727096424.61494: getting the next task for host managed_node3 24971 1727096424.61499: done getting next task for host managed_node3 24971 1727096424.61503: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 24971 1727096424.61506: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24971 1727096424.61518: getting variables 24971 1727096424.61520: in VariableManager get_vars() 24971 1727096424.61553: Calling all_inventory to load vars for managed_node3 24971 1727096424.61556: Calling groups_inventory to load vars for managed_node3 24971 1727096424.61558: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096424.61566: Calling all_plugins_play to load vars for managed_node3 24971 1727096424.61570: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096424.61573: Calling groups_plugins_play to load vars for managed_node3 24971 1727096424.62388: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096424.63227: done with get_vars() 24971 1727096424.63243: done getting variables 24971 1727096424.63286: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Monday 23 September 2024 09:00:24 -0400 (0:00:00.029) 0:00:12.110 ****** 24971 1727096424.63309: entering _queue_task() for managed_node3/fail 24971 1727096424.63519: worker is 1 (out of 1 available) 24971 1727096424.63533: exiting _queue_task() for managed_node3/fail 24971 1727096424.63545: done queuing things up, now waiting for results queue to drain 24971 1727096424.63546: waiting for pending results... 
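[Annotation] The guard "ansible_distribution_major_version != '6'" is evaluated before every task in this trace. The small playbook below is an assumed example (not part of the role) showing how that fact can be inspected and how the guard evaluates on a given host.

# Assumed illustrative playbook; the role itself applies this guard via when
# conditions on its tasks.
- name: Show the distribution major version guard
  hosts: localhost
  gather_facts: true            # populates ansible_distribution_major_version
  tasks:
    - name: Report whether the EL6 guard would pass on this host
      ansible.builtin.debug:
        msg: >-
          ansible_distribution_major_version={{ ansible_distribution_major_version }},
          guard passes: {{ ansible_distribution_major_version != '6' }}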
24971 1727096424.63707: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 24971 1727096424.63795: in run() - task 0afff68d-5257-3482-6844-00000000001b 24971 1727096424.63805: variable 'ansible_search_path' from source: unknown 24971 1727096424.63808: variable 'ansible_search_path' from source: unknown 24971 1727096424.63836: calling self._execute() 24971 1727096424.63900: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096424.63905: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096424.63914: variable 'omit' from source: magic vars 24971 1727096424.64160: variable 'ansible_distribution_major_version' from source: facts 24971 1727096424.64172: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096424.64293: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24971 1727096424.65764: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24971 1727096424.65818: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24971 1727096424.65846: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24971 1727096424.65878: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24971 1727096424.65900: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24971 1727096424.65961: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24971 1727096424.65985: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24971 1727096424.66002: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24971 1727096424.66027: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24971 1727096424.66038: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24971 1727096424.66109: variable 'ansible_distribution_major_version' from source: facts 24971 1727096424.66121: Evaluated conditional (ansible_distribution_major_version | int > 9): True 24971 1727096424.66200: variable 'ansible_distribution' from source: facts 24971 1727096424.66204: variable '__network_rh_distros' from source: role '' defaults 24971 1727096424.66212: Evaluated conditional (ansible_distribution in __network_rh_distros): True 24971 1727096424.66364: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24971 1727096424.66387: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24971 1727096424.66404: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24971 1727096424.66429: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24971 1727096424.66439: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24971 1727096424.66474: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24971 1727096424.66492: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24971 1727096424.66510: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24971 1727096424.66533: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24971 1727096424.66543: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24971 1727096424.66574: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24971 1727096424.66590: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24971 1727096424.66613: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24971 1727096424.66634: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24971 1727096424.66644: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24971 1727096424.66837: variable 'network_connections' from source: task vars 24971 1727096424.66846: variable 'interface' from source: play vars 24971 1727096424.66898: variable 'interface' from source: play vars 24971 1727096424.66908: variable 'network_state' from source: role '' defaults 24971 1727096424.66956: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24971 1727096424.67071: 
Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24971 1727096424.67105: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24971 1727096424.67128: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24971 1727096424.67156: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24971 1727096424.67187: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 24971 1727096424.67202: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 24971 1727096424.67223: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 24971 1727096424.67240: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 24971 1727096424.67274: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 24971 1727096424.67277: when evaluation is False, skipping this task 24971 1727096424.67280: _execute() done 24971 1727096424.67282: dumping result to json 24971 1727096424.67284: done dumping result, returning 24971 1727096424.67362: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0afff68d-5257-3482-6844-00000000001b] 24971 1727096424.67366: sending task result for task 0afff68d-5257-3482-6844-00000000001b 24971 1727096424.67426: done sending task result for task 0afff68d-5257-3482-6844-00000000001b 24971 1727096424.67429: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 24971 1727096424.67508: no more pending results, returning what we have 24971 1727096424.67511: results queue empty 24971 1727096424.67511: checking for any_errors_fatal 24971 1727096424.67516: done checking for any_errors_fatal 24971 1727096424.67516: checking for max_fail_percentage 24971 1727096424.67518: done checking for max_fail_percentage 24971 1727096424.67519: checking to see if all hosts have failed and the running result is not ok 24971 1727096424.67519: done checking to see if all hosts have failed 24971 1727096424.67520: getting the remaining hosts for this loop 24971 1727096424.67521: done getting the remaining hosts for this loop 24971 1727096424.67524: getting the next task for host managed_node3 24971 1727096424.67529: done getting next task for host managed_node3 24971 1727096424.67533: ^ task is: TASK: 
fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 24971 1727096424.67535: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24971 1727096424.67549: getting variables 24971 1727096424.67550: in VariableManager get_vars() 24971 1727096424.67590: Calling all_inventory to load vars for managed_node3 24971 1727096424.67593: Calling groups_inventory to load vars for managed_node3 24971 1727096424.67595: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096424.67603: Calling all_plugins_play to load vars for managed_node3 24971 1727096424.67605: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096424.67607: Calling groups_plugins_play to load vars for managed_node3 24971 1727096424.68361: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096424.69212: done with get_vars() 24971 1727096424.69229: done getting variables 24971 1727096424.69300: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Monday 23 September 2024 09:00:24 -0400 (0:00:00.060) 0:00:12.171 ****** 24971 1727096424.69322: entering _queue_task() for managed_node3/dnf 24971 1727096424.69549: worker is 1 (out of 1 available) 24971 1727096424.69562: exiting _queue_task() for managed_node3/dnf 24971 1727096424.69575: done queuing things up, now waiting for results queue to drain 24971 1727096424.69576: waiting for pending results... 
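[Annotation] The teaming abort above was skipped because neither network_connections nor network_state declares a connection of type "team". The hypothetical playbook below evaluates the same selectattr filter chain against sample data; the connection shown (ethtest0, type ethernet) is an assumption standing in for the play's interface variable referenced in the trace.

# Hypothetical data; the filter expression itself is copied from the
# false_condition reported in the log.
- name: Demonstrate the team-interface detection filter
  hosts: localhost
  gather_facts: false
  vars:
    network_state: {}
    network_connections:
      - name: ethtest0          # assumed example connection
        type: ethernet
        state: up
  tasks:
    - name: Evaluate the same team-detection expression as the role
      ansible.builtin.debug:
        msg: >-
          team interface requested:
          {{ network_connections | selectattr('type', 'defined')
             | selectattr('type', 'match', '^team$') | list | length > 0
             or network_state.get('interfaces', [])
             | selectattr('type', 'defined')
             | selectattr('type', 'match', '^team$') | list | length > 0 }}
      # With the sample data this prints False, matching the skipped abort task.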
24971 1727096424.69741: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 24971 1727096424.69828: in run() - task 0afff68d-5257-3482-6844-00000000001c 24971 1727096424.69839: variable 'ansible_search_path' from source: unknown 24971 1727096424.69842: variable 'ansible_search_path' from source: unknown 24971 1727096424.69874: calling self._execute() 24971 1727096424.69938: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096424.69943: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096424.69951: variable 'omit' from source: magic vars 24971 1727096424.70217: variable 'ansible_distribution_major_version' from source: facts 24971 1727096424.70227: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096424.70364: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24971 1727096424.72074: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24971 1727096424.72121: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24971 1727096424.72148: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24971 1727096424.72186: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24971 1727096424.72209: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24971 1727096424.72266: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24971 1727096424.72290: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24971 1727096424.72312: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24971 1727096424.72336: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24971 1727096424.72347: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24971 1727096424.72427: variable 'ansible_distribution' from source: facts 24971 1727096424.72431: variable 'ansible_distribution_major_version' from source: facts 24971 1727096424.72442: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 24971 1727096424.72521: variable '__network_wireless_connections_defined' from source: role '' defaults 24971 1727096424.72604: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24971 1727096424.72620: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24971 1727096424.72637: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24971 1727096424.72672: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24971 1727096424.72682: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24971 1727096424.72709: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24971 1727096424.72724: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24971 1727096424.72740: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24971 1727096424.72773: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24971 1727096424.72782: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24971 1727096424.72808: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24971 1727096424.72824: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24971 1727096424.72839: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24971 1727096424.72871: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24971 1727096424.72881: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24971 1727096424.72988: variable 'network_connections' from source: task vars 24971 1727096424.72991: variable 'interface' from source: play vars 24971 1727096424.73032: variable 'interface' from source: play vars 24971 1727096424.73087: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24971 1727096424.73191: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24971 1727096424.73218: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24971 1727096424.73239: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24971 1727096424.73259: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24971 1727096424.73299: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 24971 1727096424.73311: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 24971 1727096424.73332: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 24971 1727096424.73349: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 24971 1727096424.73394: variable '__network_team_connections_defined' from source: role '' defaults 24971 1727096424.73549: variable 'network_connections' from source: task vars 24971 1727096424.73552: variable 'interface' from source: play vars 24971 1727096424.73598: variable 'interface' from source: play vars 24971 1727096424.73626: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 24971 1727096424.73630: when evaluation is False, skipping this task 24971 1727096424.73632: _execute() done 24971 1727096424.73635: dumping result to json 24971 1727096424.73637: done dumping result, returning 24971 1727096424.73642: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0afff68d-5257-3482-6844-00000000001c] 24971 1727096424.73647: sending task result for task 0afff68d-5257-3482-6844-00000000001c 24971 1727096424.73731: done sending task result for task 0afff68d-5257-3482-6844-00000000001c 24971 1727096424.73733: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 24971 1727096424.73787: no more pending results, returning what we have 24971 1727096424.73790: results queue empty 24971 1727096424.73791: checking for any_errors_fatal 24971 1727096424.73798: done checking for any_errors_fatal 24971 1727096424.73798: checking for max_fail_percentage 24971 1727096424.73800: done checking for max_fail_percentage 24971 1727096424.73801: checking to see if all hosts have failed and the running result is not ok 24971 1727096424.73801: done checking to see if all hosts have failed 24971 1727096424.73802: getting the remaining hosts for this loop 24971 1727096424.73804: done getting the remaining hosts for this loop 24971 1727096424.73807: getting the next task for host managed_node3 24971 1727096424.73813: done getting next task for host managed_node3 24971 1727096424.73816: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates 
for network packages are available through the YUM package manager due to wireless or team interfaces 24971 1727096424.73819: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24971 1727096424.73832: getting variables 24971 1727096424.73834: in VariableManager get_vars() 24971 1727096424.73881: Calling all_inventory to load vars for managed_node3 24971 1727096424.73883: Calling groups_inventory to load vars for managed_node3 24971 1727096424.73885: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096424.73894: Calling all_plugins_play to load vars for managed_node3 24971 1727096424.73897: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096424.73900: Calling groups_plugins_play to load vars for managed_node3 24971 1727096424.74774: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096424.75623: done with get_vars() 24971 1727096424.75639: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 24971 1727096424.75694: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Monday 23 September 2024 09:00:24 -0400 (0:00:00.063) 0:00:12.234 ****** 24971 1727096424.75715: entering _queue_task() for managed_node3/yum 24971 1727096424.75716: Creating lock for yum 24971 1727096424.75940: worker is 1 (out of 1 available) 24971 1727096424.75954: exiting _queue_task() for managed_node3/yum 24971 1727096424.75965: done queuing things up, now waiting for results queue to drain 24971 1727096424.75966: waiting for pending results... 
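[Annotation] The DNF-based package check above was skipped because neither of the role's wireless/team booleans is true. The sketch below reproduces only the gating; the two variables are given placeholder values and the dnf action is replaced by a debug stand-in, since their real definitions are not visible in this log.

# Gating sketch only; variable values and the stand-in task are assumptions.
- name: Demonstrate the wireless/team gate on the package-update check
  hosts: localhost
  gather_facts: false
  vars:
    __network_wireless_connections_defined: false   # placeholder for this example
    __network_team_connections_defined: false       # placeholder for this example
  tasks:
    - name: Check for NetworkManager package updates (stand-in for the dnf task)
      ansible.builtin.debug:
        msg: would query the package manager for wpa_supplicant/team updates here
      when: __network_wireless_connections_defined or __network_team_connections_defined
      # Both booleans are false, so the task skips with the same
      # false_condition string shown in the trace.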
24971 1727096424.76121: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 24971 1727096424.76208: in run() - task 0afff68d-5257-3482-6844-00000000001d 24971 1727096424.76220: variable 'ansible_search_path' from source: unknown 24971 1727096424.76223: variable 'ansible_search_path' from source: unknown 24971 1727096424.76251: calling self._execute() 24971 1727096424.76312: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096424.76316: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096424.76324: variable 'omit' from source: magic vars 24971 1727096424.76574: variable 'ansible_distribution_major_version' from source: facts 24971 1727096424.76582: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096424.76695: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24971 1727096424.78133: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24971 1727096424.78187: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24971 1727096424.78213: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24971 1727096424.78237: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24971 1727096424.78259: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24971 1727096424.78324: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24971 1727096424.78342: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24971 1727096424.78359: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24971 1727096424.78394: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24971 1727096424.78404: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24971 1727096424.78466: variable 'ansible_distribution_major_version' from source: facts 24971 1727096424.78484: Evaluated conditional (ansible_distribution_major_version | int < 8): False 24971 1727096424.78488: when evaluation is False, skipping this task 24971 1727096424.78490: _execute() done 24971 1727096424.78493: dumping result to json 24971 1727096424.78495: done dumping result, returning 24971 1727096424.78502: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0afff68d-5257-3482-6844-00000000001d] 24971 
1727096424.78506: sending task result for task 0afff68d-5257-3482-6844-00000000001d 24971 1727096424.78587: done sending task result for task 0afff68d-5257-3482-6844-00000000001d 24971 1727096424.78590: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 24971 1727096424.78637: no more pending results, returning what we have 24971 1727096424.78640: results queue empty 24971 1727096424.78641: checking for any_errors_fatal 24971 1727096424.78648: done checking for any_errors_fatal 24971 1727096424.78649: checking for max_fail_percentage 24971 1727096424.78650: done checking for max_fail_percentage 24971 1727096424.78651: checking to see if all hosts have failed and the running result is not ok 24971 1727096424.78652: done checking to see if all hosts have failed 24971 1727096424.78652: getting the remaining hosts for this loop 24971 1727096424.78654: done getting the remaining hosts for this loop 24971 1727096424.78657: getting the next task for host managed_node3 24971 1727096424.78664: done getting next task for host managed_node3 24971 1727096424.78669: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 24971 1727096424.78672: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24971 1727096424.78685: getting variables 24971 1727096424.78686: in VariableManager get_vars() 24971 1727096424.78723: Calling all_inventory to load vars for managed_node3 24971 1727096424.78725: Calling groups_inventory to load vars for managed_node3 24971 1727096424.78727: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096424.78735: Calling all_plugins_play to load vars for managed_node3 24971 1727096424.78737: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096424.78739: Calling groups_plugins_play to load vars for managed_node3 24971 1727096424.79487: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096424.80417: done with get_vars() 24971 1727096424.80432: done getting variables 24971 1727096424.80473: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Monday 23 September 2024 09:00:24 -0400 (0:00:00.047) 0:00:12.282 ****** 24971 1727096424.80495: entering _queue_task() for managed_node3/fail 24971 1727096424.80695: worker is 1 (out of 1 available) 24971 1727096424.80707: exiting _queue_task() for managed_node3/fail 24971 1727096424.80719: done queuing things up, now waiting for results queue to drain 24971 1727096424.80720: waiting for pending results... 
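[Annotation] The YUM-based variant of the same check only applies to EL7 and older, which is why it was skipped here ("Evaluated conditional (ansible_distribution_major_version | int < 8): False"). A minimal, assumed example of that version gate:

# Illustrative only; the version used is a placeholder, and the task body is a
# debug stand-in for the real yum/dnf action.
- name: Demonstrate the EL7-and-older YUM gate
  hosts: localhost
  gather_facts: false
  vars:
    ansible_distribution_major_version: "10"   # assumed value for illustration
  tasks:
    - name: Only run the YUM code path on EL7 and older (stand-in task)
      ansible.builtin.debug:
        msg: would use the yum backend here
      when: ansible_distribution_major_version | int < 8
      # 10 < 8 is False, so this task is skipped, mirroring the log above.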
24971 1727096424.80880: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 24971 1727096424.80958: in run() - task 0afff68d-5257-3482-6844-00000000001e 24971 1727096424.80969: variable 'ansible_search_path' from source: unknown 24971 1727096424.80974: variable 'ansible_search_path' from source: unknown 24971 1727096424.81002: calling self._execute() 24971 1727096424.81066: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096424.81074: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096424.81083: variable 'omit' from source: magic vars 24971 1727096424.81325: variable 'ansible_distribution_major_version' from source: facts 24971 1727096424.81334: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096424.81417: variable '__network_wireless_connections_defined' from source: role '' defaults 24971 1727096424.81539: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24971 1727096424.82957: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24971 1727096424.83010: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24971 1727096424.83037: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24971 1727096424.83063: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24971 1727096424.83087: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24971 1727096424.83145: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24971 1727096424.83165: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24971 1727096424.83186: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24971 1727096424.83211: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24971 1727096424.83221: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24971 1727096424.83256: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24971 1727096424.83275: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24971 1727096424.83292: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24971 1727096424.83316: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24971 1727096424.83326: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24971 1727096424.83357: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24971 1727096424.83377: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24971 1727096424.83394: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24971 1727096424.83417: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24971 1727096424.83427: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24971 1727096424.83545: variable 'network_connections' from source: task vars 24971 1727096424.83556: variable 'interface' from source: play vars 24971 1727096424.83611: variable 'interface' from source: play vars 24971 1727096424.83658: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24971 1727096424.83764: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24971 1727096424.83796: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24971 1727096424.83828: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24971 1727096424.83850: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24971 1727096424.83886: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 24971 1727096424.83903: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 24971 1727096424.83922: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 24971 1727096424.83940: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 24971 1727096424.83987: variable '__network_team_connections_defined' from source: role '' defaults 24971 1727096424.84136: variable 'network_connections' from 
source: task vars 24971 1727096424.84139: variable 'interface' from source: play vars 24971 1727096424.84185: variable 'interface' from source: play vars 24971 1727096424.84209: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 24971 1727096424.84214: when evaluation is False, skipping this task 24971 1727096424.84217: _execute() done 24971 1727096424.84219: dumping result to json 24971 1727096424.84221: done dumping result, returning 24971 1727096424.84232: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0afff68d-5257-3482-6844-00000000001e] 24971 1727096424.84235: sending task result for task 0afff68d-5257-3482-6844-00000000001e 24971 1727096424.84312: done sending task result for task 0afff68d-5257-3482-6844-00000000001e 24971 1727096424.84314: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 24971 1727096424.84384: no more pending results, returning what we have 24971 1727096424.84387: results queue empty 24971 1727096424.84388: checking for any_errors_fatal 24971 1727096424.84395: done checking for any_errors_fatal 24971 1727096424.84395: checking for max_fail_percentage 24971 1727096424.84397: done checking for max_fail_percentage 24971 1727096424.84398: checking to see if all hosts have failed and the running result is not ok 24971 1727096424.84398: done checking to see if all hosts have failed 24971 1727096424.84399: getting the remaining hosts for this loop 24971 1727096424.84401: done getting the remaining hosts for this loop 24971 1727096424.84405: getting the next task for host managed_node3 24971 1727096424.84411: done getting next task for host managed_node3 24971 1727096424.84415: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 24971 1727096424.84417: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24971 1727096424.84430: getting variables 24971 1727096424.84432: in VariableManager get_vars() 24971 1727096424.84471: Calling all_inventory to load vars for managed_node3 24971 1727096424.84474: Calling groups_inventory to load vars for managed_node3 24971 1727096424.84476: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096424.84484: Calling all_plugins_play to load vars for managed_node3 24971 1727096424.84487: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096424.84489: Calling groups_plugins_play to load vars for managed_node3 24971 1727096424.85252: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096424.86105: done with get_vars() 24971 1727096424.86124: done getting variables 24971 1727096424.86166: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Monday 23 September 2024 09:00:24 -0400 (0:00:00.056) 0:00:12.339 ****** 24971 1727096424.86192: entering _queue_task() for managed_node3/package 24971 1727096424.86435: worker is 1 (out of 1 available) 24971 1727096424.86447: exiting _queue_task() for managed_node3/package 24971 1727096424.86459: done queuing things up, now waiting for results queue to drain 24971 1727096424.86460: waiting for pending results... 24971 1727096424.86630: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages 24971 1727096424.86728: in run() - task 0afff68d-5257-3482-6844-00000000001f 24971 1727096424.86738: variable 'ansible_search_path' from source: unknown 24971 1727096424.86742: variable 'ansible_search_path' from source: unknown 24971 1727096424.86772: calling self._execute() 24971 1727096424.86839: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096424.86843: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096424.86852: variable 'omit' from source: magic vars 24971 1727096424.87121: variable 'ansible_distribution_major_version' from source: facts 24971 1727096424.87132: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096424.87262: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24971 1727096424.87461: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24971 1727096424.87494: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24971 1727096424.87519: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24971 1727096424.87544: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24971 1727096424.87627: variable 'network_packages' from source: role '' defaults 24971 1727096424.87703: variable '__network_provider_setup' from source: role '' defaults 24971 1727096424.87711: variable '__network_service_name_default_nm' from source: role '' defaults 24971 1727096424.87761: variable 
'__network_service_name_default_nm' from source: role '' defaults 24971 1727096424.87771: variable '__network_packages_default_nm' from source: role '' defaults 24971 1727096424.87818: variable '__network_packages_default_nm' from source: role '' defaults 24971 1727096424.87933: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24971 1727096424.89278: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24971 1727096424.89339: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24971 1727096424.89462: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24971 1727096424.89465: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24971 1727096424.89470: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24971 1727096424.89530: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24971 1727096424.89550: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24971 1727096424.89571: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24971 1727096424.89600: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24971 1727096424.89610: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24971 1727096424.89641: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24971 1727096424.89656: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24971 1727096424.89679: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24971 1727096424.89704: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24971 1727096424.89714: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24971 1727096424.89860: variable '__network_packages_default_gobject_packages' from source: role '' defaults 24971 1727096424.89941: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24971 1727096424.89958: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24971 1727096424.89979: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24971 1727096424.90006: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24971 1727096424.90017: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24971 1727096424.90080: variable 'ansible_python' from source: facts 24971 1727096424.90099: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 24971 1727096424.90156: variable '__network_wpa_supplicant_required' from source: role '' defaults 24971 1727096424.90214: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 24971 1727096424.90298: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24971 1727096424.90314: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24971 1727096424.90332: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24971 1727096424.90360: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24971 1727096424.90374: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24971 1727096424.90404: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24971 1727096424.90423: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24971 1727096424.90444: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24971 1727096424.90476: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24971 1727096424.90482: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24971 1727096424.90575: variable 'network_connections' from source: task vars 24971 1727096424.90582: variable 'interface' from source: play vars 24971 1727096424.90649: variable 'interface' from source: play vars 24971 1727096424.90704: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 24971 1727096424.90723: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 24971 1727096424.90743: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 24971 1727096424.90764: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 24971 1727096424.90803: variable '__network_wireless_connections_defined' from source: role '' defaults 24971 1727096424.91172: variable 'network_connections' from source: task vars 24971 1727096424.91175: variable 'interface' from source: play vars 24971 1727096424.91177: variable 'interface' from source: play vars 24971 1727096424.91179: variable '__network_packages_default_wireless' from source: role '' defaults 24971 1727096424.91254: variable '__network_wireless_connections_defined' from source: role '' defaults 24971 1727096424.91559: variable 'network_connections' from source: task vars 24971 1727096424.91572: variable 'interface' from source: play vars 24971 1727096424.91635: variable 'interface' from source: play vars 24971 1727096424.91666: variable '__network_packages_default_team' from source: role '' defaults 24971 1727096424.91748: variable '__network_team_connections_defined' from source: role '' defaults 24971 1727096424.92054: variable 'network_connections' from source: task vars 24971 1727096424.92065: variable 'interface' from source: play vars 24971 1727096424.92135: variable 'interface' from source: play vars 24971 1727096424.92205: variable '__network_service_name_default_initscripts' from source: role '' defaults 24971 1727096424.92274: variable '__network_service_name_default_initscripts' from source: role '' defaults 24971 1727096424.92288: variable '__network_packages_default_initscripts' from source: role '' defaults 24971 1727096424.92352: variable '__network_packages_default_initscripts' from source: role '' defaults 24971 1727096424.92578: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 24971 1727096424.93421: variable 'network_connections' from source: task vars 24971 1727096424.93432: variable 'interface' from source: play vars 24971 1727096424.93499: variable 'interface' from source: play vars 24971 1727096424.93516: variable 'ansible_distribution' from source: facts 24971 1727096424.93524: variable '__network_rh_distros' from source: role '' defaults 24971 1727096424.93533: variable 'ansible_distribution_major_version' from source: facts 24971 1727096424.93559: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 24971 1727096424.93704: variable 'ansible_distribution' from source: facts 24971 
1727096424.93713: variable '__network_rh_distros' from source: role '' defaults 24971 1727096424.93722: variable 'ansible_distribution_major_version' from source: facts 24971 1727096424.93738: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 24971 1727096424.93888: variable 'ansible_distribution' from source: facts 24971 1727096424.93897: variable '__network_rh_distros' from source: role '' defaults 24971 1727096424.93906: variable 'ansible_distribution_major_version' from source: facts 24971 1727096424.94073: variable 'network_provider' from source: set_fact 24971 1727096424.94076: variable 'ansible_facts' from source: unknown 24971 1727096424.98420: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 24971 1727096424.98429: when evaluation is False, skipping this task 24971 1727096424.98436: _execute() done 24971 1727096424.98449: dumping result to json 24971 1727096424.98459: done dumping result, returning 24971 1727096424.98462: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages [0afff68d-5257-3482-6844-00000000001f] 24971 1727096424.98464: sending task result for task 0afff68d-5257-3482-6844-00000000001f 24971 1727096424.98566: done sending task result for task 0afff68d-5257-3482-6844-00000000001f skipping: [managed_node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 24971 1727096424.98626: no more pending results, returning what we have 24971 1727096424.98629: results queue empty 24971 1727096424.98630: checking for any_errors_fatal 24971 1727096424.98636: done checking for any_errors_fatal 24971 1727096424.98637: checking for max_fail_percentage 24971 1727096424.98638: done checking for max_fail_percentage 24971 1727096424.98639: checking to see if all hosts have failed and the running result is not ok 24971 1727096424.98640: done checking to see if all hosts have failed 24971 1727096424.98641: getting the remaining hosts for this loop 24971 1727096424.98642: done getting the remaining hosts for this loop 24971 1727096424.98646: getting the next task for host managed_node3 24971 1727096424.98652: done getting next task for host managed_node3 24971 1727096424.98655: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 24971 1727096424.98658: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24971 1727096424.98675: getting variables 24971 1727096424.98676: in VariableManager get_vars() 24971 1727096424.98716: Calling all_inventory to load vars for managed_node3 24971 1727096424.98718: Calling groups_inventory to load vars for managed_node3 24971 1727096424.98720: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096424.98729: Calling all_plugins_play to load vars for managed_node3 24971 1727096424.98731: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096424.98733: Calling groups_plugins_play to load vars for managed_node3 24971 1727096424.99287: WORKER PROCESS EXITING 24971 1727096424.99685: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096425.04437: done with get_vars() 24971 1727096425.04461: done getting variables 24971 1727096425.04517: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Monday 23 September 2024 09:00:25 -0400 (0:00:00.183) 0:00:12.523 ****** 24971 1727096425.04547: entering _queue_task() for managed_node3/package 24971 1727096425.04887: worker is 1 (out of 1 available) 24971 1727096425.04899: exiting _queue_task() for managed_node3/package 24971 1727096425.04911: done queuing things up, now waiting for results queue to drain 24971 1727096425.04912: waiting for pending results... 
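The "Install packages" task above was skipped because every name in network_packages was already present in ansible_facts.packages. A minimal sketch of such a guarded install task, assuming the package module shown in the ActionModule load and a network_packages list variable; the role's actual source may differ:

- name: Install packages
  package:
    name: "{{ network_packages }}"   # assumed variable; only the conditional below appears in the log
    state: present
  when: not network_packages is subset(ansible_facts.packages.keys())
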
24971 1727096425.05377: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 24971 1727096425.05382: in run() - task 0afff68d-5257-3482-6844-000000000020 24971 1727096425.05386: variable 'ansible_search_path' from source: unknown 24971 1727096425.05389: variable 'ansible_search_path' from source: unknown 24971 1727096425.05393: calling self._execute() 24971 1727096425.05451: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096425.05465: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096425.05484: variable 'omit' from source: magic vars 24971 1727096425.05859: variable 'ansible_distribution_major_version' from source: facts 24971 1727096425.05881: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096425.06006: variable 'network_state' from source: role '' defaults 24971 1727096425.06021: Evaluated conditional (network_state != {}): False 24971 1727096425.06032: when evaluation is False, skipping this task 24971 1727096425.06041: _execute() done 24971 1727096425.06047: dumping result to json 24971 1727096425.06060: done dumping result, returning 24971 1727096425.06077: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0afff68d-5257-3482-6844-000000000020] 24971 1727096425.06086: sending task result for task 0afff68d-5257-3482-6844-000000000020 skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 24971 1727096425.06245: no more pending results, returning what we have 24971 1727096425.06249: results queue empty 24971 1727096425.06250: checking for any_errors_fatal 24971 1727096425.06260: done checking for any_errors_fatal 24971 1727096425.06261: checking for max_fail_percentage 24971 1727096425.06262: done checking for max_fail_percentage 24971 1727096425.06263: checking to see if all hosts have failed and the running result is not ok 24971 1727096425.06264: done checking to see if all hosts have failed 24971 1727096425.06265: getting the remaining hosts for this loop 24971 1727096425.06266: done getting the remaining hosts for this loop 24971 1727096425.06273: getting the next task for host managed_node3 24971 1727096425.06280: done getting next task for host managed_node3 24971 1727096425.06284: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 24971 1727096425.06287: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24971 1727096425.06305: getting variables 24971 1727096425.06307: in VariableManager get_vars() 24971 1727096425.06348: Calling all_inventory to load vars for managed_node3 24971 1727096425.06350: Calling groups_inventory to load vars for managed_node3 24971 1727096425.06353: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096425.06365: Calling all_plugins_play to load vars for managed_node3 24971 1727096425.06675: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096425.06681: Calling groups_plugins_play to load vars for managed_node3 24971 1727096425.07383: done sending task result for task 0afff68d-5257-3482-6844-000000000020 24971 1727096425.07386: WORKER PROCESS EXITING 24971 1727096425.07930: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096425.09508: done with get_vars() 24971 1727096425.09528: done getting variables 24971 1727096425.09587: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Monday 23 September 2024 09:00:25 -0400 (0:00:00.050) 0:00:12.573 ****** 24971 1727096425.09621: entering _queue_task() for managed_node3/package 24971 1727096425.09911: worker is 1 (out of 1 available) 24971 1727096425.09922: exiting _queue_task() for managed_node3/package 24971 1727096425.09936: done queuing things up, now waiting for results queue to drain 24971 1727096425.09937: waiting for pending results... 
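The task at tasks/main.yml:85 was skipped above because network_state is the empty dict. A hedged sketch of an equivalent task, with the package names taken from the task title and the package module confirmed by the ActionModule load; the exact source may differ:

- name: Install NetworkManager and nmstate when using network_state variable
  package:
    name:
      - NetworkManager   # assumed from the task title
      - nmstate          # assumed from the task title
    state: present
  when: network_state != {}   # conditional reported in the skip result above
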
24971 1727096425.10205: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 24971 1727096425.10334: in run() - task 0afff68d-5257-3482-6844-000000000021 24971 1727096425.10351: variable 'ansible_search_path' from source: unknown 24971 1727096425.10357: variable 'ansible_search_path' from source: unknown 24971 1727096425.10403: calling self._execute() 24971 1727096425.10489: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096425.10506: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096425.10519: variable 'omit' from source: magic vars 24971 1727096425.10923: variable 'ansible_distribution_major_version' from source: facts 24971 1727096425.10947: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096425.11079: variable 'network_state' from source: role '' defaults 24971 1727096425.11093: Evaluated conditional (network_state != {}): False 24971 1727096425.11101: when evaluation is False, skipping this task 24971 1727096425.11109: _execute() done 24971 1727096425.11116: dumping result to json 24971 1727096425.11124: done dumping result, returning 24971 1727096425.11135: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0afff68d-5257-3482-6844-000000000021] 24971 1727096425.11144: sending task result for task 0afff68d-5257-3482-6844-000000000021 skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 24971 1727096425.11307: no more pending results, returning what we have 24971 1727096425.11311: results queue empty 24971 1727096425.11312: checking for any_errors_fatal 24971 1727096425.11318: done checking for any_errors_fatal 24971 1727096425.11319: checking for max_fail_percentage 24971 1727096425.11321: done checking for max_fail_percentage 24971 1727096425.11322: checking to see if all hosts have failed and the running result is not ok 24971 1727096425.11323: done checking to see if all hosts have failed 24971 1727096425.11323: getting the remaining hosts for this loop 24971 1727096425.11325: done getting the remaining hosts for this loop 24971 1727096425.11328: getting the next task for host managed_node3 24971 1727096425.11335: done getting next task for host managed_node3 24971 1727096425.11339: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 24971 1727096425.11343: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24971 1727096425.11358: getting variables 24971 1727096425.11360: in VariableManager get_vars() 24971 1727096425.11406: Calling all_inventory to load vars for managed_node3 24971 1727096425.11409: Calling groups_inventory to load vars for managed_node3 24971 1727096425.11412: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096425.11424: Calling all_plugins_play to load vars for managed_node3 24971 1727096425.11427: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096425.11430: Calling groups_plugins_play to load vars for managed_node3 24971 1727096425.12182: done sending task result for task 0afff68d-5257-3482-6844-000000000021 24971 1727096425.12186: WORKER PROCESS EXITING 24971 1727096425.13117: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096425.14136: done with get_vars() 24971 1727096425.14151: done getting variables 24971 1727096425.14223: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Monday 23 September 2024 09:00:25 -0400 (0:00:00.046) 0:00:12.620 ****** 24971 1727096425.14246: entering _queue_task() for managed_node3/service 24971 1727096425.14247: Creating lock for service 24971 1727096425.14463: worker is 1 (out of 1 available) 24971 1727096425.14482: exiting _queue_task() for managed_node3/service 24971 1727096425.14493: done queuing things up, now waiting for results queue to drain 24971 1727096425.14494: waiting for pending results... 
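The task at tasks/main.yml:109 queued above restarts NetworkManager only when wireless or team connections are defined; its conditional is evaluated (and found False) further down. A hedged sketch of such a task, assuming the service module shown in the ActionModule load and the NetworkManager unit name implied by the task title:

- name: Restart NetworkManager due to wireless or team interfaces
  service:
    name: NetworkManager   # assumed from the task title
    state: restarted
  when: __network_wireless_connections_defined or __network_team_connections_defined
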
24971 1727096425.14654: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 24971 1727096425.14746: in run() - task 0afff68d-5257-3482-6844-000000000022 24971 1727096425.14757: variable 'ansible_search_path' from source: unknown 24971 1727096425.14760: variable 'ansible_search_path' from source: unknown 24971 1727096425.14793: calling self._execute() 24971 1727096425.14861: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096425.14866: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096425.14877: variable 'omit' from source: magic vars 24971 1727096425.15135: variable 'ansible_distribution_major_version' from source: facts 24971 1727096425.15144: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096425.15277: variable '__network_wireless_connections_defined' from source: role '' defaults 24971 1727096425.15440: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24971 1727096425.17136: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24971 1727096425.17187: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24971 1727096425.17216: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24971 1727096425.17242: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24971 1727096425.17261: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24971 1727096425.17323: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24971 1727096425.17343: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24971 1727096425.17360: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24971 1727096425.17389: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24971 1727096425.17399: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24971 1727096425.17433: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24971 1727096425.17450: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24971 1727096425.17466: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 24971 1727096425.17494: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24971 1727096425.17504: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24971 1727096425.17535: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24971 1727096425.17551: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24971 1727096425.17567: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24971 1727096425.17593: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24971 1727096425.17604: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24971 1727096425.17715: variable 'network_connections' from source: task vars 24971 1727096425.17725: variable 'interface' from source: play vars 24971 1727096425.17780: variable 'interface' from source: play vars 24971 1727096425.17835: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24971 1727096425.18230: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24971 1727096425.18234: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24971 1727096425.18237: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24971 1727096425.18281: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24971 1727096425.18366: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 24971 1727096425.18397: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 24971 1727096425.18428: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 24971 1727096425.18461: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 24971 1727096425.18529: variable '__network_team_connections_defined' from source: role '' defaults 24971 1727096425.18759: variable 'network_connections' from source: task vars 24971 1727096425.18774: variable 'interface' from source: 
play vars 24971 1727096425.18838: variable 'interface' from source: play vars 24971 1727096425.18879: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 24971 1727096425.18889: when evaluation is False, skipping this task 24971 1727096425.18898: _execute() done 24971 1727096425.18906: dumping result to json 24971 1727096425.18914: done dumping result, returning 24971 1727096425.18926: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0afff68d-5257-3482-6844-000000000022] 24971 1727096425.18935: sending task result for task 0afff68d-5257-3482-6844-000000000022 skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 24971 1727096425.19076: done sending task result for task 0afff68d-5257-3482-6844-000000000022 24971 1727096425.19188: no more pending results, returning what we have 24971 1727096425.19369: results queue empty 24971 1727096425.19370: checking for any_errors_fatal 24971 1727096425.19377: done checking for any_errors_fatal 24971 1727096425.19378: checking for max_fail_percentage 24971 1727096425.19379: done checking for max_fail_percentage 24971 1727096425.19380: checking to see if all hosts have failed and the running result is not ok 24971 1727096425.19381: done checking to see if all hosts have failed 24971 1727096425.19382: getting the remaining hosts for this loop 24971 1727096425.19383: done getting the remaining hosts for this loop 24971 1727096425.19387: getting the next task for host managed_node3 24971 1727096425.19392: done getting next task for host managed_node3 24971 1727096425.19396: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 24971 1727096425.19398: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24971 1727096425.19412: getting variables 24971 1727096425.19413: in VariableManager get_vars() 24971 1727096425.19453: Calling all_inventory to load vars for managed_node3 24971 1727096425.19456: Calling groups_inventory to load vars for managed_node3 24971 1727096425.19459: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096425.19469: Calling all_plugins_play to load vars for managed_node3 24971 1727096425.19474: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096425.19477: Calling groups_plugins_play to load vars for managed_node3 24971 1727096425.20108: WORKER PROCESS EXITING 24971 1727096425.21657: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096425.23461: done with get_vars() 24971 1727096425.23499: done getting variables 24971 1727096425.23553: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Monday 23 September 2024 09:00:25 -0400 (0:00:00.093) 0:00:12.713 ****** 24971 1727096425.23588: entering _queue_task() for managed_node3/service 24971 1727096425.23935: worker is 1 (out of 1 available) 24971 1727096425.23950: exiting _queue_task() for managed_node3/service 24971 1727096425.23963: done queuing things up, now waiting for results queue to drain 24971 1727096425.23964: waiting for pending results... 
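The task at tasks/main.yml:122 queued above is the first in this section that actually runs: its conditional (network_provider == "nm" or network_state != {}) evaluates to True below, and the service action is dispatched over SSH as AnsiballZ_systemd.py. A hedged sketch of such a task, assuming the network_service_name variable resolved below and the enable/start semantics implied by the task title:

- name: Enable and start NetworkManager
  service:
    name: "{{ network_service_name }}"   # resolved from role defaults in the log below
    state: started                       # assumed from the task title
    enabled: true                        # assumed from the task title
  when: network_provider == "nm" or network_state != {}
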
24971 1727096425.24207: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 24971 1727096425.24477: in run() - task 0afff68d-5257-3482-6844-000000000023 24971 1727096425.24481: variable 'ansible_search_path' from source: unknown 24971 1727096425.24483: variable 'ansible_search_path' from source: unknown 24971 1727096425.24486: calling self._execute() 24971 1727096425.24488: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096425.24491: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096425.24494: variable 'omit' from source: magic vars 24971 1727096425.24884: variable 'ansible_distribution_major_version' from source: facts 24971 1727096425.24900: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096425.25073: variable 'network_provider' from source: set_fact 24971 1727096425.25085: variable 'network_state' from source: role '' defaults 24971 1727096425.25100: Evaluated conditional (network_provider == "nm" or network_state != {}): True 24971 1727096425.25112: variable 'omit' from source: magic vars 24971 1727096425.25234: variable 'omit' from source: magic vars 24971 1727096425.25237: variable 'network_service_name' from source: role '' defaults 24971 1727096425.25299: variable 'network_service_name' from source: role '' defaults 24971 1727096425.25382: variable '__network_provider_setup' from source: role '' defaults 24971 1727096425.25392: variable '__network_service_name_default_nm' from source: role '' defaults 24971 1727096425.25441: variable '__network_service_name_default_nm' from source: role '' defaults 24971 1727096425.25449: variable '__network_packages_default_nm' from source: role '' defaults 24971 1727096425.25503: variable '__network_packages_default_nm' from source: role '' defaults 24971 1727096425.25644: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24971 1727096425.27575: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24971 1727096425.27578: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24971 1727096425.27580: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24971 1727096425.27582: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24971 1727096425.27584: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24971 1727096425.27636: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24971 1727096425.27673: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24971 1727096425.27705: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24971 1727096425.27749: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 
(found_in_cache=True, class_only=False) 24971 1727096425.27774: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24971 1727096425.27822: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24971 1727096425.27853: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24971 1727096425.27887: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24971 1727096425.27931: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24971 1727096425.27952: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24971 1727096425.28141: variable '__network_packages_default_gobject_packages' from source: role '' defaults 24971 1727096425.28218: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24971 1727096425.28235: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24971 1727096425.28258: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24971 1727096425.28297: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24971 1727096425.28307: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24971 1727096425.28368: variable 'ansible_python' from source: facts 24971 1727096425.28387: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 24971 1727096425.28442: variable '__network_wpa_supplicant_required' from source: role '' defaults 24971 1727096425.28499: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 24971 1727096425.28617: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24971 1727096425.28620: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24971 1727096425.28623: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24971 1727096425.28632: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24971 1727096425.28643: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24971 1727096425.28680: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24971 1727096425.28700: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24971 1727096425.28716: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24971 1727096425.28742: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24971 1727096425.28752: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24971 1727096425.28842: variable 'network_connections' from source: task vars 24971 1727096425.28848: variable 'interface' from source: play vars 24971 1727096425.28903: variable 'interface' from source: play vars 24971 1727096425.28978: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24971 1727096425.29103: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24971 1727096425.29138: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24971 1727096425.29170: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24971 1727096425.29201: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24971 1727096425.29243: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 24971 1727096425.29265: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 24971 1727096425.29291: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 24971 1727096425.29313: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 24971 1727096425.29350: variable '__network_wireless_connections_defined' from source: role 
'' defaults 24971 1727096425.29523: variable 'network_connections' from source: task vars 24971 1727096425.29527: variable 'interface' from source: play vars 24971 1727096425.29583: variable 'interface' from source: play vars 24971 1727096425.29617: variable '__network_packages_default_wireless' from source: role '' defaults 24971 1727096425.29670: variable '__network_wireless_connections_defined' from source: role '' defaults 24971 1727096425.29851: variable 'network_connections' from source: task vars 24971 1727096425.29854: variable 'interface' from source: play vars 24971 1727096425.29908: variable 'interface' from source: play vars 24971 1727096425.29928: variable '__network_packages_default_team' from source: role '' defaults 24971 1727096425.29983: variable '__network_team_connections_defined' from source: role '' defaults 24971 1727096425.30163: variable 'network_connections' from source: task vars 24971 1727096425.30167: variable 'interface' from source: play vars 24971 1727096425.30223: variable 'interface' from source: play vars 24971 1727096425.30263: variable '__network_service_name_default_initscripts' from source: role '' defaults 24971 1727096425.30308: variable '__network_service_name_default_initscripts' from source: role '' defaults 24971 1727096425.30312: variable '__network_packages_default_initscripts' from source: role '' defaults 24971 1727096425.30357: variable '__network_packages_default_initscripts' from source: role '' defaults 24971 1727096425.30490: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 24971 1727096425.30784: variable 'network_connections' from source: task vars 24971 1727096425.30787: variable 'interface' from source: play vars 24971 1727096425.30830: variable 'interface' from source: play vars 24971 1727096425.30838: variable 'ansible_distribution' from source: facts 24971 1727096425.30840: variable '__network_rh_distros' from source: role '' defaults 24971 1727096425.30846: variable 'ansible_distribution_major_version' from source: facts 24971 1727096425.30862: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 24971 1727096425.30978: variable 'ansible_distribution' from source: facts 24971 1727096425.30981: variable '__network_rh_distros' from source: role '' defaults 24971 1727096425.30984: variable 'ansible_distribution_major_version' from source: facts 24971 1727096425.30997: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 24971 1727096425.31106: variable 'ansible_distribution' from source: facts 24971 1727096425.31109: variable '__network_rh_distros' from source: role '' defaults 24971 1727096425.31114: variable 'ansible_distribution_major_version' from source: facts 24971 1727096425.31140: variable 'network_provider' from source: set_fact 24971 1727096425.31156: variable 'omit' from source: magic vars 24971 1727096425.31180: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24971 1727096425.31198: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24971 1727096425.31214: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24971 1727096425.31228: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096425.31236: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096425.31257: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24971 1727096425.31260: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096425.31263: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096425.31330: Set connection var ansible_shell_type to sh 24971 1727096425.31338: Set connection var ansible_shell_executable to /bin/sh 24971 1727096425.31347: Set connection var ansible_timeout to 10 24971 1727096425.31350: Set connection var ansible_connection to ssh 24971 1727096425.31356: Set connection var ansible_pipelining to False 24971 1727096425.31360: Set connection var ansible_module_compression to ZIP_DEFLATED 24971 1727096425.31381: variable 'ansible_shell_executable' from source: unknown 24971 1727096425.31384: variable 'ansible_connection' from source: unknown 24971 1727096425.31386: variable 'ansible_module_compression' from source: unknown 24971 1727096425.31389: variable 'ansible_shell_type' from source: unknown 24971 1727096425.31391: variable 'ansible_shell_executable' from source: unknown 24971 1727096425.31393: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096425.31396: variable 'ansible_pipelining' from source: unknown 24971 1727096425.31399: variable 'ansible_timeout' from source: unknown 24971 1727096425.31403: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096425.31473: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24971 1727096425.31482: variable 'omit' from source: magic vars 24971 1727096425.31489: starting attempt loop 24971 1727096425.31492: running the handler 24971 1727096425.31543: variable 'ansible_facts' from source: unknown 24971 1727096425.31931: _low_level_execute_command(): starting 24971 1727096425.31937: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24971 1727096425.32421: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096425.32424: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096425.32427: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096425.32429: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096425.32482: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 
1727096425.32486: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096425.32530: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096425.34159: stdout chunk (state=3): >>>/root <<< 24971 1727096425.34257: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096425.34288: stderr chunk (state=3): >>><<< 24971 1727096425.34291: stdout chunk (state=3): >>><<< 24971 1727096425.34309: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096425.34318: _low_level_execute_command(): starting 24971 1727096425.34322: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096425.343083-25545-130762925484474 `" && echo ansible-tmp-1727096425.343083-25545-130762925484474="` echo /root/.ansible/tmp/ansible-tmp-1727096425.343083-25545-130762925484474 `" ) && sleep 0' 24971 1727096425.34741: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096425.34744: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 24971 1727096425.34746: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096425.34749: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096425.34750: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096425.34807: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096425.34812: 
stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096425.34815: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096425.34848: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096425.36698: stdout chunk (state=3): >>>ansible-tmp-1727096425.343083-25545-130762925484474=/root/.ansible/tmp/ansible-tmp-1727096425.343083-25545-130762925484474 <<< 24971 1727096425.36804: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096425.36828: stderr chunk (state=3): >>><<< 24971 1727096425.36831: stdout chunk (state=3): >>><<< 24971 1727096425.36844: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096425.343083-25545-130762925484474=/root/.ansible/tmp/ansible-tmp-1727096425.343083-25545-130762925484474 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096425.36878: variable 'ansible_module_compression' from source: unknown 24971 1727096425.36919: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 24971 1727096425.36923: ANSIBALLZ: Acquiring lock 24971 1727096425.36926: ANSIBALLZ: Lock acquired: 139839577444416 24971 1727096425.36928: ANSIBALLZ: Creating module 24971 1727096425.59277: ANSIBALLZ: Writing module into payload 24971 1727096425.59333: ANSIBALLZ: Writing module 24971 1727096425.59362: ANSIBALLZ: Renaming module 24971 1727096425.59380: ANSIBALLZ: Done creating module 24971 1727096425.59423: variable 'ansible_facts' from source: unknown 24971 1727096425.59640: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096425.343083-25545-130762925484474/AnsiballZ_systemd.py 24971 1727096425.59854: Sending initial data 24971 1727096425.59865: Sent initial data (155 bytes) 24971 1727096425.60303: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096425.60330: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096425.60378: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096425.60393: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096425.60438: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096425.62041: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 24971 1727096425.62060: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24971 1727096425.62113: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 24971 1727096425.62138: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24971ctsu8jan/tmpca_yznud /root/.ansible/tmp/ansible-tmp-1727096425.343083-25545-130762925484474/AnsiballZ_systemd.py <<< 24971 1727096425.62141: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096425.343083-25545-130762925484474/AnsiballZ_systemd.py" <<< 24971 1727096425.62184: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24971ctsu8jan/tmpca_yznud" to remote "/root/.ansible/tmp/ansible-tmp-1727096425.343083-25545-130762925484474/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096425.343083-25545-130762925484474/AnsiballZ_systemd.py" <<< 24971 1727096425.63682: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096425.63686: stdout chunk (state=3): >>><<< 24971 1727096425.63688: stderr chunk (state=3): >>><<< 24971 1727096425.63690: done transferring module to remote 24971 1727096425.63696: _low_level_execute_command(): starting 24971 1727096425.63707: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096425.343083-25545-130762925484474/ /root/.ansible/tmp/ansible-tmp-1727096425.343083-25545-130762925484474/AnsiballZ_systemd.py && sleep 0' 24971 1727096425.64343: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24971 1727096425.64359: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096425.64379: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096425.64401: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 24971 1727096425.64404: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096425.64432: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096425.64445: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096425.64502: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096425.64506: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096425.64559: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096425.66385: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096425.66392: stdout chunk (state=3): >>><<< 24971 1727096425.66399: stderr chunk (state=3): >>><<< 24971 1727096425.66483: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096425.66487: _low_level_execute_command(): starting 24971 1727096425.66490: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096425.343083-25545-130762925484474/AnsiballZ_systemd.py && sleep 0' 24971 1727096425.67021: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24971 1727096425.67118: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096425.67148: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096425.67213: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096425.96331: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "705", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Mon 2024-09-23 08:51:17 EDT", "ExecMainStartTimestampMonotonic": "22098154", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Mon 2024-09-23 08:51:17 EDT", "ExecMainHandoffTimestampMonotonic": "22114950", "ExecMainPID": "705", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10571776", "MemoryPeak": "14127104", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3323289600", "EffectiveMemoryMax": "3702882304", "EffectiveMemoryHigh": "3702882304", "CPUUsageNSec": "1704464000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", 
"IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpR<<< 24971 1727096425.96337: stdout chunk (state=3): >>>eceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", 
"ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target cloud-init.service NetworkManager-wait-online.service network.target multi-user.target", "After": "systemd-journald.socket basic.target system.slice sysinit.target dbus.socket dbus-broker.service cloud-init-local.service network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Mon 2024-09-23 08:54:46 EDT", "StateChangeTimestampMonotonic": "229668491", "InactiveExitTimestamp": "Mon 2024-09-23 08:51:17 EDT", "InactiveExitTimestampMonotonic": "22098658", "ActiveEnterTimestamp": "Mon 2024-09-23 08:51:18 EDT", "ActiveEnterTimestampMonotonic": "22578647", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Mon 2024-09-23 08:51:17 EDT", "ConditionTimestampMonotonic": "22097143", "AssertTimestamp": "Mon 2024-09-23 08:51:17 
EDT", "AssertTimestampMonotonic": "22097145", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "795485e52798420593bcdae791c1fbbf", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 24971 1727096425.98275: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. <<< 24971 1727096425.98279: stdout chunk (state=3): >>><<< 24971 1727096425.98282: stderr chunk (state=3): >>><<< 24971 1727096425.98290: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "705", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Mon 2024-09-23 08:51:17 EDT", "ExecMainStartTimestampMonotonic": "22098154", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Mon 2024-09-23 08:51:17 EDT", "ExecMainHandoffTimestampMonotonic": "22114950", "ExecMainPID": "705", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10571776", "MemoryPeak": "14127104", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3323289600", "EffectiveMemoryMax": "3702882304", "EffectiveMemoryHigh": "3702882304", "CPUUsageNSec": "1704464000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": 
"[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", 
"UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target cloud-init.service NetworkManager-wait-online.service network.target multi-user.target", "After": "systemd-journald.socket basic.target system.slice sysinit.target dbus.socket dbus-broker.service cloud-init-local.service network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Mon 2024-09-23 08:54:46 EDT", "StateChangeTimestampMonotonic": "229668491", "InactiveExitTimestamp": "Mon 2024-09-23 08:51:17 EDT", "InactiveExitTimestampMonotonic": "22098658", "ActiveEnterTimestamp": "Mon 2024-09-23 08:51:18 EDT", "ActiveEnterTimestampMonotonic": "22578647", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Mon 2024-09-23 08:51:17 EDT", "ConditionTimestampMonotonic": "22097143", "AssertTimestamp": "Mon 2024-09-23 08:51:17 EDT", "AssertTimestampMonotonic": "22097145", 
"Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "795485e52798420593bcdae791c1fbbf", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 24971 1727096425.98436: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096425.343083-25545-130762925484474/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24971 1727096425.98461: _low_level_execute_command(): starting 24971 1727096425.98472: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096425.343083-25545-130762925484474/ > /dev/null 2>&1 && sleep 0' 24971 1727096425.99063: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24971 1727096425.99083: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096425.99099: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096425.99119: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24971 1727096425.99187: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096425.99233: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096425.99251: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096425.99277: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096425.99340: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096426.01139: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096426.01192: stderr chunk (state=3): >>><<< 24971 1727096426.01196: stdout chunk (state=3): >>><<< 24971 1727096426.01216: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096426.01224: handler run complete 24971 1727096426.01298: attempt loop complete, returning result 24971 1727096426.01301: _execute() done 24971 1727096426.01303: dumping result to json 24971 1727096426.01329: done dumping result, returning 24971 1727096426.01339: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0afff68d-5257-3482-6844-000000000023] 24971 1727096426.01342: sending task result for task 0afff68d-5257-3482-6844-000000000023 ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 24971 1727096426.01693: no more pending results, returning what we have 24971 1727096426.01696: results queue empty 24971 1727096426.01697: checking for any_errors_fatal 24971 1727096426.01703: done checking for any_errors_fatal 24971 1727096426.01704: checking for max_fail_percentage 24971 1727096426.01705: done checking for max_fail_percentage 24971 1727096426.01706: checking to see if all hosts have failed and the running result is not ok 24971 1727096426.01707: done checking to see if all hosts have failed 24971 1727096426.01708: getting 
the remaining hosts for this loop 24971 1727096426.01709: done getting the remaining hosts for this loop 24971 1727096426.01715: getting the next task for host managed_node3 24971 1727096426.01721: done getting next task for host managed_node3 24971 1727096426.01725: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 24971 1727096426.01727: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24971 1727096426.01737: getting variables 24971 1727096426.01738: in VariableManager get_vars() 24971 1727096426.01777: Calling all_inventory to load vars for managed_node3 24971 1727096426.01780: Calling groups_inventory to load vars for managed_node3 24971 1727096426.01782: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096426.01794: Calling all_plugins_play to load vars for managed_node3 24971 1727096426.01797: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096426.01800: Calling groups_plugins_play to load vars for managed_node3 24971 1727096426.02408: done sending task result for task 0afff68d-5257-3482-6844-000000000023 24971 1727096426.02412: WORKER PROCESS EXITING 24971 1727096426.03562: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096426.05366: done with get_vars() 24971 1727096426.05391: done getting variables 24971 1727096426.05448: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Monday 23 September 2024 09:00:26 -0400 (0:00:00.818) 0:00:13.532 ****** 24971 1727096426.05486: entering _queue_task() for managed_node3/service 24971 1727096426.05999: worker is 1 (out of 1 available) 24971 1727096426.06006: exiting _queue_task() for managed_node3/service 24971 1727096426.06016: done queuing things up, now waiting for results queue to drain 24971 1727096426.06017: waiting for pending results... 
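For reference, the censored "Enable and start NetworkManager" result above corresponds to the module_args recorded in the raw output (name=NetworkManager, state=started, enabled=true, dispatched to ansible.legacy.systemd by the 'service' action plugin, with the result hidden because no_log was set). The sketch below is a hedged reconstruction of an equivalent standalone task, not the role's actual task file in fedora.linux_system_roles.network; only the module arguments and the no_log behaviour are taken from the log, everything else is illustrative.

    # Sketch only: reconstructed from the module_args shown in the log above.
    # The 'service' action delegates to ansible.legacy.systemd on systemd hosts,
    # which is why AnsiballZ_systemd.py was transferred and executed.
    - name: Enable and start NetworkManager
      ansible.builtin.service:
        name: NetworkManager
        state: started
        enabled: true
      no_log: true   # matches the "output has been hidden" message in the result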
24971 1727096426.06145: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 24971 1727096426.06183: in run() - task 0afff68d-5257-3482-6844-000000000024 24971 1727096426.06202: variable 'ansible_search_path' from source: unknown 24971 1727096426.06209: variable 'ansible_search_path' from source: unknown 24971 1727096426.06254: calling self._execute() 24971 1727096426.06340: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096426.06358: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096426.06374: variable 'omit' from source: magic vars 24971 1727096426.06724: variable 'ansible_distribution_major_version' from source: facts 24971 1727096426.06740: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096426.06863: variable 'network_provider' from source: set_fact 24971 1727096426.06875: Evaluated conditional (network_provider == "nm"): True 24971 1727096426.06977: variable '__network_wpa_supplicant_required' from source: role '' defaults 24971 1727096426.07076: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 24971 1727096426.07250: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24971 1727096426.09286: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24971 1727096426.09394: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24971 1727096426.09399: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24971 1727096426.09439: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24971 1727096426.09474: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24971 1727096426.09572: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24971 1727096426.09614: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24971 1727096426.09647: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24971 1727096426.09722: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24971 1727096426.09726: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24971 1727096426.09767: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24971 1727096426.09800: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 24971 1727096426.09872: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24971 1727096426.09884: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24971 1727096426.09904: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24971 1727096426.09958: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24971 1727096426.09989: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24971 1727096426.10019: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24971 1727096426.10158: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24971 1727096426.10161: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24971 1727096426.10228: variable 'network_connections' from source: task vars 24971 1727096426.10246: variable 'interface' from source: play vars 24971 1727096426.10326: variable 'interface' from source: play vars 24971 1727096426.10411: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24971 1727096426.10590: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24971 1727096426.10633: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24971 1727096426.10673: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24971 1727096426.10714: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24971 1727096426.10760: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 24971 1727096426.10791: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 24971 1727096426.10824: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 24971 1727096426.10850: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 24971 
1727096426.10899: variable '__network_wireless_connections_defined' from source: role '' defaults 24971 1727096426.11137: variable 'network_connections' from source: task vars 24971 1727096426.11147: variable 'interface' from source: play vars 24971 1727096426.11210: variable 'interface' from source: play vars 24971 1727096426.11256: Evaluated conditional (__network_wpa_supplicant_required): False 24971 1727096426.11262: when evaluation is False, skipping this task 24971 1727096426.11271: _execute() done 24971 1727096426.11277: dumping result to json 24971 1727096426.11283: done dumping result, returning 24971 1727096426.11292: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0afff68d-5257-3482-6844-000000000024] 24971 1727096426.11308: sending task result for task 0afff68d-5257-3482-6844-000000000024 24971 1727096426.11605: done sending task result for task 0afff68d-5257-3482-6844-000000000024 24971 1727096426.11609: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 24971 1727096426.11650: no more pending results, returning what we have 24971 1727096426.11653: results queue empty 24971 1727096426.11654: checking for any_errors_fatal 24971 1727096426.11673: done checking for any_errors_fatal 24971 1727096426.11674: checking for max_fail_percentage 24971 1727096426.11676: done checking for max_fail_percentage 24971 1727096426.11677: checking to see if all hosts have failed and the running result is not ok 24971 1727096426.11678: done checking to see if all hosts have failed 24971 1727096426.11679: getting the remaining hosts for this loop 24971 1727096426.11680: done getting the remaining hosts for this loop 24971 1727096426.11684: getting the next task for host managed_node3 24971 1727096426.11689: done getting next task for host managed_node3 24971 1727096426.11693: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 24971 1727096426.11695: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24971 1727096426.11708: getting variables 24971 1727096426.11710: in VariableManager get_vars() 24971 1727096426.11752: Calling all_inventory to load vars for managed_node3 24971 1727096426.11755: Calling groups_inventory to load vars for managed_node3 24971 1727096426.11757: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096426.11766: Calling all_plugins_play to load vars for managed_node3 24971 1727096426.11772: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096426.11776: Calling groups_plugins_play to load vars for managed_node3 24971 1727096426.13620: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096426.15366: done with get_vars() 24971 1727096426.15387: done getting variables 24971 1727096426.15440: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Monday 23 September 2024 09:00:26 -0400 (0:00:00.099) 0:00:13.632 ****** 24971 1727096426.15474: entering _queue_task() for managed_node3/service 24971 1727096426.15900: worker is 1 (out of 1 available) 24971 1727096426.15911: exiting _queue_task() for managed_node3/service 24971 1727096426.15922: done queuing things up, now waiting for results queue to drain 24971 1727096426.15923: waiting for pending results... 24971 1727096426.16019: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service 24971 1727096426.16147: in run() - task 0afff68d-5257-3482-6844-000000000025 24971 1727096426.16172: variable 'ansible_search_path' from source: unknown 24971 1727096426.16181: variable 'ansible_search_path' from source: unknown 24971 1727096426.16259: calling self._execute() 24971 1727096426.16317: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096426.16328: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096426.16340: variable 'omit' from source: magic vars 24971 1727096426.16714: variable 'ansible_distribution_major_version' from source: facts 24971 1727096426.16732: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096426.16850: variable 'network_provider' from source: set_fact 24971 1727096426.16911: Evaluated conditional (network_provider == "initscripts"): False 24971 1727096426.16914: when evaluation is False, skipping this task 24971 1727096426.16917: _execute() done 24971 1727096426.16920: dumping result to json 24971 1727096426.16922: done dumping result, returning 24971 1727096426.16924: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [0afff68d-5257-3482-6844-000000000025] 24971 1727096426.16926: sending task result for task 0afff68d-5257-3482-6844-000000000025 skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 24971 1727096426.17053: no more pending results, returning what we have 24971 1727096426.17056: results queue empty 24971 1727096426.17057: checking for 
any_errors_fatal 24971 1727096426.17066: done checking for any_errors_fatal 24971 1727096426.17067: checking for max_fail_percentage 24971 1727096426.17070: done checking for max_fail_percentage 24971 1727096426.17071: checking to see if all hosts have failed and the running result is not ok 24971 1727096426.17072: done checking to see if all hosts have failed 24971 1727096426.17072: getting the remaining hosts for this loop 24971 1727096426.17074: done getting the remaining hosts for this loop 24971 1727096426.17077: getting the next task for host managed_node3 24971 1727096426.17084: done getting next task for host managed_node3 24971 1727096426.17087: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 24971 1727096426.17090: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24971 1727096426.17105: getting variables 24971 1727096426.17107: in VariableManager get_vars() 24971 1727096426.17145: Calling all_inventory to load vars for managed_node3 24971 1727096426.17148: Calling groups_inventory to load vars for managed_node3 24971 1727096426.17150: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096426.17161: Calling all_plugins_play to load vars for managed_node3 24971 1727096426.17164: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096426.17274: Calling groups_plugins_play to load vars for managed_node3 24971 1727096426.17287: done sending task result for task 0afff68d-5257-3482-6844-000000000025 24971 1727096426.17290: WORKER PROCESS EXITING 24971 1727096426.18765: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096426.20974: done with get_vars() 24971 1727096426.20999: done getting variables 24971 1727096426.21058: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Monday 23 September 2024 09:00:26 -0400 (0:00:00.056) 0:00:13.688 ****** 24971 1727096426.21092: entering _queue_task() for managed_node3/copy 24971 1727096426.21399: worker is 1 (out of 1 available) 24971 1727096426.21410: exiting _queue_task() for managed_node3/copy 24971 1727096426.21532: done queuing things up, now waiting for results queue to drain 24971 1727096426.21533: waiting for pending results... 
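The two skipped tasks above illustrate the provider gating in this role: "Enable and start wpa_supplicant" is skipped because __network_wpa_supplicant_required evaluated to False, and "Enable network service" is skipped because network_provider is "nm" rather than "initscripts" (the same check gates the initscripts file-dependency task queued next). The snippet below is only an illustration of that `when:` pattern; the task bodies are placeholders, and only the conditions mirror what the log shows being evaluated.

    # Illustration only: bodies are guesses, the conditions are the ones the log
    # evaluated (both False on this NetworkManager-managed host, so both tasks skip).
    - name: Enable and start wpa_supplicant
      ansible.builtin.service:
        name: wpa_supplicant
        state: started
        enabled: true
      when: __network_wpa_supplicant_required

    - name: Enable network service
      ansible.builtin.service:
        name: network
        state: started
        enabled: true
      when: network_provider == "initscripts"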
24971 1727096426.21632: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 24971 1727096426.21758: in run() - task 0afff68d-5257-3482-6844-000000000026 24971 1727096426.21778: variable 'ansible_search_path' from source: unknown 24971 1727096426.21785: variable 'ansible_search_path' from source: unknown 24971 1727096426.21819: calling self._execute() 24971 1727096426.21909: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096426.22003: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096426.22030: variable 'omit' from source: magic vars 24971 1727096426.22371: variable 'ansible_distribution_major_version' from source: facts 24971 1727096426.22389: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096426.22506: variable 'network_provider' from source: set_fact 24971 1727096426.22628: Evaluated conditional (network_provider == "initscripts"): False 24971 1727096426.22632: when evaluation is False, skipping this task 24971 1727096426.22634: _execute() done 24971 1727096426.22637: dumping result to json 24971 1727096426.22639: done dumping result, returning 24971 1727096426.22642: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0afff68d-5257-3482-6844-000000000026] 24971 1727096426.22645: sending task result for task 0afff68d-5257-3482-6844-000000000026 24971 1727096426.22715: done sending task result for task 0afff68d-5257-3482-6844-000000000026 24971 1727096426.22719: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 24971 1727096426.22778: no more pending results, returning what we have 24971 1727096426.22781: results queue empty 24971 1727096426.22782: checking for any_errors_fatal 24971 1727096426.22790: done checking for any_errors_fatal 24971 1727096426.22790: checking for max_fail_percentage 24971 1727096426.22792: done checking for max_fail_percentage 24971 1727096426.22793: checking to see if all hosts have failed and the running result is not ok 24971 1727096426.22794: done checking to see if all hosts have failed 24971 1727096426.22794: getting the remaining hosts for this loop 24971 1727096426.22796: done getting the remaining hosts for this loop 24971 1727096426.22799: getting the next task for host managed_node3 24971 1727096426.22806: done getting next task for host managed_node3 24971 1727096426.22809: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 24971 1727096426.22812: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24971 1727096426.22828: getting variables 24971 1727096426.22830: in VariableManager get_vars() 24971 1727096426.22872: Calling all_inventory to load vars for managed_node3 24971 1727096426.22874: Calling groups_inventory to load vars for managed_node3 24971 1727096426.22877: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096426.22888: Calling all_plugins_play to load vars for managed_node3 24971 1727096426.22891: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096426.22894: Calling groups_plugins_play to load vars for managed_node3 24971 1727096426.25635: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096426.28452: done with get_vars() 24971 1727096426.28475: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Monday 23 September 2024 09:00:26 -0400 (0:00:00.074) 0:00:13.763 ****** 24971 1727096426.28553: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 24971 1727096426.28555: Creating lock for fedora.linux_system_roles.network_connections 24971 1727096426.29156: worker is 1 (out of 1 available) 24971 1727096426.29370: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 24971 1727096426.29383: done queuing things up, now waiting for results queue to drain 24971 1727096426.29384: waiting for pending results... 24971 1727096426.29980: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 24971 1727096426.30284: in run() - task 0afff68d-5257-3482-6844-000000000027 24971 1727096426.30422: variable 'ansible_search_path' from source: unknown 24971 1727096426.30426: variable 'ansible_search_path' from source: unknown 24971 1727096426.30429: calling self._execute() 24971 1727096426.30512: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096426.30648: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096426.30665: variable 'omit' from source: magic vars 24971 1727096426.31349: variable 'ansible_distribution_major_version' from source: facts 24971 1727096426.31415: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096426.31619: variable 'omit' from source: magic vars 24971 1727096426.31622: variable 'omit' from source: magic vars 24971 1727096426.31901: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24971 1727096426.36196: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24971 1727096426.36432: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24971 1727096426.36435: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24971 1727096426.36650: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24971 1727096426.36653: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24971 1727096426.36771: variable 'network_provider' from source: set_fact 24971 1727096426.37009: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24971 1727096426.37059: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24971 1727096426.37107: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24971 1727096426.37220: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24971 1727096426.37294: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24971 1727096426.37632: variable 'omit' from source: magic vars 24971 1727096426.37977: variable 'omit' from source: magic vars 24971 1727096426.38085: variable 'network_connections' from source: task vars 24971 1727096426.38088: variable 'interface' from source: play vars 24971 1727096426.38090: variable 'interface' from source: play vars 24971 1727096426.38378: variable 'omit' from source: magic vars 24971 1727096426.38421: variable '__lsr_ansible_managed' from source: task vars 24971 1727096426.38485: variable '__lsr_ansible_managed' from source: task vars 24971 1727096426.39123: Loaded config def from plugin (lookup/template) 24971 1727096426.39133: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 24971 1727096426.39164: File lookup term: get_ansible_managed.j2 24971 1727096426.39375: variable 'ansible_search_path' from source: unknown 24971 1727096426.39379: evaluation_path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 24971 1727096426.39383: search_path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 24971 1727096426.39393: variable 'ansible_search_path' from source: unknown 24971 1727096426.52167: variable 'ansible_managed' from source: unknown 24971 1727096426.52676: variable 'omit' from source: magic vars 24971 1727096426.52680: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24971 1727096426.52682: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24971 1727096426.52684: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24971 1727096426.52686: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096426.52688: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096426.52690: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24971 1727096426.52692: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096426.52694: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096426.52905: Set connection var ansible_shell_type to sh 24971 1727096426.52919: Set connection var ansible_shell_executable to /bin/sh 24971 1727096426.52934: Set connection var ansible_timeout to 10 24971 1727096426.52944: Set connection var ansible_connection to ssh 24971 1727096426.52952: Set connection var ansible_pipelining to False 24971 1727096426.52961: Set connection var ansible_module_compression to ZIP_DEFLATED 24971 1727096426.52993: variable 'ansible_shell_executable' from source: unknown 24971 1727096426.53082: variable 'ansible_connection' from source: unknown 24971 1727096426.53093: variable 'ansible_module_compression' from source: unknown 24971 1727096426.53101: variable 'ansible_shell_type' from source: unknown 24971 1727096426.53109: variable 'ansible_shell_executable' from source: unknown 24971 1727096426.53115: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096426.53123: variable 'ansible_pipelining' from source: unknown 24971 1727096426.53130: variable 'ansible_timeout' from source: unknown 24971 1727096426.53137: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096426.53303: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 24971 1727096426.53441: variable 'omit' from source: magic vars 24971 1727096426.53454: starting attempt loop 24971 1727096426.53575: running the handler 24971 1727096426.53578: _low_level_execute_command(): starting 24971 1727096426.53581: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24971 1727096426.55257: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096426.55261: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096426.55263: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 
1727096426.55265: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096426.55315: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096426.55405: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096426.55686: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096426.55740: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096426.57492: stdout chunk (state=3): >>>/root <<< 24971 1727096426.57506: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096426.57543: stderr chunk (state=3): >>><<< 24971 1727096426.57552: stdout chunk (state=3): >>><<< 24971 1727096426.57883: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096426.57886: _low_level_execute_command(): starting 24971 1727096426.57890: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096426.5779188-25581-115415859189330 `" && echo ansible-tmp-1727096426.5779188-25581-115415859189330="` echo /root/.ansible/tmp/ansible-tmp-1727096426.5779188-25581-115415859189330 `" ) && sleep 0' 24971 1727096426.58890: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096426.58904: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096426.58915: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096426.58979: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096426.59193: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096426.59284: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096426.61197: stdout chunk (state=3): >>>ansible-tmp-1727096426.5779188-25581-115415859189330=/root/.ansible/tmp/ansible-tmp-1727096426.5779188-25581-115415859189330 <<< 24971 1727096426.61399: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096426.61403: stdout chunk (state=3): >>><<< 24971 1727096426.61405: stderr chunk (state=3): >>><<< 24971 1727096426.61425: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096426.5779188-25581-115415859189330=/root/.ansible/tmp/ansible-tmp-1727096426.5779188-25581-115415859189330 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096426.61501: variable 'ansible_module_compression' from source: unknown 24971 1727096426.61680: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 24971 1727096426.61684: ANSIBALLZ: Acquiring lock 24971 1727096426.61686: ANSIBALLZ: Lock acquired: 139839573400256 24971 1727096426.61690: ANSIBALLZ: Creating module 24971 1727096426.93281: ANSIBALLZ: Writing module into payload 24971 1727096426.93880: ANSIBALLZ: Writing module 24971 1727096426.93901: ANSIBALLZ: Renaming module 24971 1727096426.93907: ANSIBALLZ: Done creating module 24971 1727096426.93932: variable 'ansible_facts' from source: unknown 24971 1727096426.94044: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096426.5779188-25581-115415859189330/AnsiballZ_network_connections.py 24971 1727096426.94520: Sending initial data 24971 1727096426.94524: Sent initial data (168 bytes) 24971 1727096426.95744: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24971 1727096426.95856: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 24971 1727096426.95859: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096426.95978: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096426.95982: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096426.95985: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096426.96021: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096426.97619: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 24971 1727096426.97624: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24971 1727096426.97678: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 24971 1727096426.97708: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24971ctsu8jan/tmpgznfva0t /root/.ansible/tmp/ansible-tmp-1727096426.5779188-25581-115415859189330/AnsiballZ_network_connections.py <<< 24971 1727096426.97711: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096426.5779188-25581-115415859189330/AnsiballZ_network_connections.py" <<< 24971 1727096426.97773: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24971ctsu8jan/tmpgznfva0t" to remote "/root/.ansible/tmp/ansible-tmp-1727096426.5779188-25581-115415859189330/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096426.5779188-25581-115415859189330/AnsiballZ_network_connections.py" <<< 24971 1727096426.98824: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096426.98828: stdout chunk (state=3): >>><<< 24971 1727096426.98830: stderr chunk (state=3): >>><<< 24971 1727096426.98832: done transferring module to remote 24971 1727096426.98834: _low_level_execute_command(): starting 24971 1727096426.98836: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096426.5779188-25581-115415859189330/ /root/.ansible/tmp/ansible-tmp-1727096426.5779188-25581-115415859189330/AnsiballZ_network_connections.py && sleep 0' 24971 1727096426.99673: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24971 1727096426.99708: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24971 1727096426.99814: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096426.99855: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096426.99938: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096427.01762: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096427.01779: stdout chunk (state=3): >>><<< 24971 1727096427.01790: stderr chunk (state=3): >>><<< 24971 1727096427.01809: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096427.01816: _low_level_execute_command(): starting 24971 1727096427.01824: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096426.5779188-25581-115415859189330/AnsiballZ_network_connections.py && sleep 0' 24971 1727096427.02424: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24971 1727096427.02440: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096427.02458: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096427.02482: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24971 1727096427.02502: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 24971 1727096427.02601: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096427.02635: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096427.02702: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096429.07590: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'veth0': add connection veth0, 02b26351-6d86-4c58-9ebc-fea256f8cb97\n[004] #0, state:up persistent_state:present, 'veth0': up connection veth0, 02b26351-6d86-4c58-9ebc-fea256f8cb97 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "type": "ethernet", "state": "up", "ip": {"dhcp4": false, "auto6": false, "address": ["2001:db8::2/32", "2001:db8::3/32", "2001:db8::4/32"], "gateway6": "2001:db8::1"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "type": "ethernet", "state": "up", "ip": {"dhcp4": false, "auto6": false, 
"address": ["2001:db8::2/32", "2001:db8::3/32", "2001:db8::4/32"], "gateway6": "2001:db8::1"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 24971 1727096429.09690: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. <<< 24971 1727096429.09758: stderr chunk (state=3): >>><<< 24971 1727096429.09775: stdout chunk (state=3): >>><<< 24971 1727096429.09799: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'veth0': add connection veth0, 02b26351-6d86-4c58-9ebc-fea256f8cb97\n[004] #0, state:up persistent_state:present, 'veth0': up connection veth0, 02b26351-6d86-4c58-9ebc-fea256f8cb97 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "type": "ethernet", "state": "up", "ip": {"dhcp4": false, "auto6": false, "address": ["2001:db8::2/32", "2001:db8::3/32", "2001:db8::4/32"], "gateway6": "2001:db8::1"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "type": "ethernet", "state": "up", "ip": {"dhcp4": false, "auto6": false, "address": ["2001:db8::2/32", "2001:db8::3/32", "2001:db8::4/32"], "gateway6": "2001:db8::1"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
24971 1727096429.09853: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'veth0', 'type': 'ethernet', 'state': 'up', 'ip': {'dhcp4': False, 'auto6': False, 'address': ['2001:db8::2/32', '2001:db8::3/32', '2001:db8::4/32'], 'gateway6': '2001:db8::1'}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096426.5779188-25581-115415859189330/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24971 1727096429.09876: _low_level_execute_command(): starting 24971 1727096429.09887: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096426.5779188-25581-115415859189330/ > /dev/null 2>&1 && sleep 0' 24971 1727096429.10583: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096429.10620: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096429.10637: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096429.10658: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096429.10725: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096429.12872: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096429.12893: stderr chunk (state=3): >>><<< 24971 1727096429.12901: stdout chunk (state=3): >>><<< 24971 1727096429.12925: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096429.12941: handler run complete 24971 1727096429.13074: attempt loop complete, returning result 24971 1727096429.13077: _execute() done 24971 1727096429.13080: dumping result to json 24971 1727096429.13082: done dumping result, returning 24971 1727096429.13084: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0afff68d-5257-3482-6844-000000000027] 24971 1727096429.13086: sending task result for task 0afff68d-5257-3482-6844-000000000027 changed: [managed_node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "address": [ "2001:db8::2/32", "2001:db8::3/32", "2001:db8::4/32" ], "auto6": false, "dhcp4": false, "gateway6": "2001:db8::1" }, "name": "veth0", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [003] #0, state:up persistent_state:present, 'veth0': add connection veth0, 02b26351-6d86-4c58-9ebc-fea256f8cb97 [004] #0, state:up persistent_state:present, 'veth0': up connection veth0, 02b26351-6d86-4c58-9ebc-fea256f8cb97 (not-active) 24971 1727096429.13486: no more pending results, returning what we have 24971 1727096429.13489: results queue empty 24971 1727096429.13490: checking for any_errors_fatal 24971 1727096429.13497: done checking for any_errors_fatal 24971 1727096429.13498: checking for max_fail_percentage 24971 1727096429.13500: done checking for max_fail_percentage 24971 1727096429.13501: checking to see if all hosts have failed and the running result is not ok 24971 1727096429.13502: done checking to see if all hosts have failed 24971 1727096429.13502: getting the remaining hosts for this loop 24971 1727096429.13504: done getting the remaining hosts for this loop 24971 1727096429.13507: getting the next task for host managed_node3 24971 1727096429.13513: done getting next task for host managed_node3 24971 1727096429.13517: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 24971 1727096429.13520: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24971 1727096429.13530: getting variables 24971 1727096429.13531: in VariableManager get_vars() 24971 1727096429.13577: Calling all_inventory to load vars for managed_node3 24971 1727096429.13579: Calling groups_inventory to load vars for managed_node3 24971 1727096429.13582: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096429.13588: done sending task result for task 0afff68d-5257-3482-6844-000000000027 24971 1727096429.13591: WORKER PROCESS EXITING 24971 1727096429.13601: Calling all_plugins_play to load vars for managed_node3 24971 1727096429.13604: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096429.13606: Calling groups_plugins_play to load vars for managed_node3 24971 1727096429.16055: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096429.17625: done with get_vars() 24971 1727096429.17653: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Monday 23 September 2024 09:00:29 -0400 (0:00:02.891) 0:00:16.655 ****** 24971 1727096429.17748: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state 24971 1727096429.17750: Creating lock for fedora.linux_system_roles.network_state 24971 1727096429.18901: worker is 1 (out of 1 available) 24971 1727096429.18911: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state 24971 1727096429.18922: done queuing things up, now waiting for results queue to drain 24971 1727096429.18923: waiting for pending results... 24971 1727096429.19390: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state 24971 1727096429.19499: in run() - task 0afff68d-5257-3482-6844-000000000028 24971 1727096429.19517: variable 'ansible_search_path' from source: unknown 24971 1727096429.19521: variable 'ansible_search_path' from source: unknown 24971 1727096429.19555: calling self._execute() 24971 1727096429.19651: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096429.19655: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096429.19666: variable 'omit' from source: magic vars 24971 1727096429.20029: variable 'ansible_distribution_major_version' from source: facts 24971 1727096429.20043: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096429.20164: variable 'network_state' from source: role '' defaults 24971 1727096429.20178: Evaluated conditional (network_state != {}): False 24971 1727096429.20181: when evaluation is False, skipping this task 24971 1727096429.20184: _execute() done 24971 1727096429.20187: dumping result to json 24971 1727096429.20189: done dumping result, returning 24971 1727096429.20192: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [0afff68d-5257-3482-6844-000000000028] 24971 1727096429.20198: sending task result for task 0afff68d-5257-3482-6844-000000000028 24971 1727096429.20290: done sending task result for task 0afff68d-5257-3482-6844-000000000028 24971 1727096429.20292: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 24971 1727096429.20337: no more pending results, returning what we have 
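Both tasks skipped in this section ("Ensure initscripts network file dependency is present" and "Configure networking state") fail their 'when:' guards rather than running and reporting no change. A minimal sketch of that guard pattern, with the conditions copied from the "Evaluated conditional" / "false_condition" lines above; the debug body is only a placeholder, since the real task bodies are not visible in this log:

    # Illustrative only: the conditions come from the log, the body is a stand-in.
    - name: Configure networking state
      ansible.builtin.debug:
        msg: "network_state would be applied here"
      when:
        - ansible_distribution_major_version != '6'
        - network_state != {}

Because 'network_state' resolves from the role defaults as {} (the log reports source: role '' defaults), this task is skipped whenever the caller leaves it unset.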
24971 1727096429.20341: results queue empty 24971 1727096429.20342: checking for any_errors_fatal 24971 1727096429.20352: done checking for any_errors_fatal 24971 1727096429.20352: checking for max_fail_percentage 24971 1727096429.20354: done checking for max_fail_percentage 24971 1727096429.20355: checking to see if all hosts have failed and the running result is not ok 24971 1727096429.20356: done checking to see if all hosts have failed 24971 1727096429.20356: getting the remaining hosts for this loop 24971 1727096429.20358: done getting the remaining hosts for this loop 24971 1727096429.20361: getting the next task for host managed_node3 24971 1727096429.20371: done getting next task for host managed_node3 24971 1727096429.20375: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 24971 1727096429.20378: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24971 1727096429.20395: getting variables 24971 1727096429.20396: in VariableManager get_vars() 24971 1727096429.20432: Calling all_inventory to load vars for managed_node3 24971 1727096429.20434: Calling groups_inventory to load vars for managed_node3 24971 1727096429.20436: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096429.20445: Calling all_plugins_play to load vars for managed_node3 24971 1727096429.20447: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096429.20449: Calling groups_plugins_play to load vars for managed_node3 24971 1727096429.22154: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096429.23852: done with get_vars() 24971 1727096429.23877: done getting variables 24971 1727096429.23939: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Monday 23 September 2024 09:00:29 -0400 (0:00:00.062) 0:00:16.717 ****** 24971 1727096429.23979: entering _queue_task() for managed_node3/debug 24971 1727096429.24276: worker is 1 (out of 1 available) 24971 1727096429.24401: exiting _queue_task() for managed_node3/debug 24971 1727096429.24412: done queuing things up, now waiting for results queue to drain 24971 1727096429.24413: waiting for pending results... 
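For reference, the connection profile applied above at tasks/main.yml:159 corresponds to the role's public variables. The playbook excerpt below is reconstructed from the module_args recorded in the log; the 'interface' play variable and the 'roles:' wiring are assumptions, since the actual test playbook is not shown here:

    # Reconstructed from the logged module_args; assumed, not copied from the test playbook.
    - name: Reproduce the veth0 profile applied above
      hosts: managed_node3
      vars:
        interface: veth0              # play var the role reads per the log
        network_provider: nm          # resolved from set_fact in the log
        network_connections:
          - name: "{{ interface }}"
            type: ethernet
            state: up
            ip:
              dhcp4: false
              auto6: false
              address:
                - 2001:db8::2/32
                - 2001:db8::3/32
                - 2001:db8::4/32
              gateway6: 2001:db8::1
      roles:
        - fedora.linux_system_roles.network

The '__header' argument seen in the module_args (the '# Ansible managed' banner) is supplied by the role itself via the get_ansible_managed.j2 template lookup logged earlier, so it does not appear among the user-facing variables.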
24971 1727096429.24850: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 24971 1727096429.24856: in run() - task 0afff68d-5257-3482-6844-000000000029 24971 1727096429.24860: variable 'ansible_search_path' from source: unknown 24971 1727096429.24863: variable 'ansible_search_path' from source: unknown 24971 1727096429.24880: calling self._execute() 24971 1727096429.24989: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096429.24993: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096429.25077: variable 'omit' from source: magic vars 24971 1727096429.25875: variable 'ansible_distribution_major_version' from source: facts 24971 1727096429.25886: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096429.25892: variable 'omit' from source: magic vars 24971 1727096429.26002: variable 'omit' from source: magic vars 24971 1727096429.26111: variable 'omit' from source: magic vars 24971 1727096429.26275: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24971 1727096429.26278: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24971 1727096429.26376: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24971 1727096429.26379: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096429.26382: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096429.26384: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24971 1727096429.26386: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096429.26388: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096429.26559: Set connection var ansible_shell_type to sh 24971 1727096429.26573: Set connection var ansible_shell_executable to /bin/sh 24971 1727096429.26582: Set connection var ansible_timeout to 10 24971 1727096429.26587: Set connection var ansible_connection to ssh 24971 1727096429.26592: Set connection var ansible_pipelining to False 24971 1727096429.26598: Set connection var ansible_module_compression to ZIP_DEFLATED 24971 1727096429.26695: variable 'ansible_shell_executable' from source: unknown 24971 1727096429.26698: variable 'ansible_connection' from source: unknown 24971 1727096429.26701: variable 'ansible_module_compression' from source: unknown 24971 1727096429.26704: variable 'ansible_shell_type' from source: unknown 24971 1727096429.26706: variable 'ansible_shell_executable' from source: unknown 24971 1727096429.26708: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096429.26713: variable 'ansible_pipelining' from source: unknown 24971 1727096429.26875: variable 'ansible_timeout' from source: unknown 24971 1727096429.26878: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096429.27049: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24971 
1727096429.27059: variable 'omit' from source: magic vars 24971 1727096429.27062: starting attempt loop 24971 1727096429.27065: running the handler 24971 1727096429.27420: variable '__network_connections_result' from source: set_fact 24971 1727096429.27517: handler run complete 24971 1727096429.27535: attempt loop complete, returning result 24971 1727096429.27538: _execute() done 24971 1727096429.27540: dumping result to json 24971 1727096429.27550: done dumping result, returning 24971 1727096429.27560: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0afff68d-5257-3482-6844-000000000029] 24971 1727096429.27563: sending task result for task 0afff68d-5257-3482-6844-000000000029 24971 1727096429.27652: done sending task result for task 0afff68d-5257-3482-6844-000000000029 24971 1727096429.27656: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result.stderr_lines": [ "[003] #0, state:up persistent_state:present, 'veth0': add connection veth0, 02b26351-6d86-4c58-9ebc-fea256f8cb97", "[004] #0, state:up persistent_state:present, 'veth0': up connection veth0, 02b26351-6d86-4c58-9ebc-fea256f8cb97 (not-active)" ] } 24971 1727096429.27724: no more pending results, returning what we have 24971 1727096429.27728: results queue empty 24971 1727096429.27729: checking for any_errors_fatal 24971 1727096429.27735: done checking for any_errors_fatal 24971 1727096429.27735: checking for max_fail_percentage 24971 1727096429.27737: done checking for max_fail_percentage 24971 1727096429.27738: checking to see if all hosts have failed and the running result is not ok 24971 1727096429.27739: done checking to see if all hosts have failed 24971 1727096429.27740: getting the remaining hosts for this loop 24971 1727096429.27741: done getting the remaining hosts for this loop 24971 1727096429.27745: getting the next task for host managed_node3 24971 1727096429.27751: done getting next task for host managed_node3 24971 1727096429.27754: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 24971 1727096429.27757: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24971 1727096429.27770: getting variables 24971 1727096429.27772: in VariableManager get_vars() 24971 1727096429.27811: Calling all_inventory to load vars for managed_node3 24971 1727096429.27813: Calling groups_inventory to load vars for managed_node3 24971 1727096429.27815: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096429.27826: Calling all_plugins_play to load vars for managed_node3 24971 1727096429.27829: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096429.27832: Calling groups_plugins_play to load vars for managed_node3 24971 1727096429.29492: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096429.32800: done with get_vars() 24971 1727096429.32822: done getting variables 24971 1727096429.32882: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Monday 23 September 2024 09:00:29 -0400 (0:00:00.089) 0:00:16.806 ****** 24971 1727096429.32916: entering _queue_task() for managed_node3/debug 24971 1727096429.33205: worker is 1 (out of 1 available) 24971 1727096429.33221: exiting _queue_task() for managed_node3/debug 24971 1727096429.33234: done queuing things up, now waiting for results queue to drain 24971 1727096429.33235: waiting for pending results... 24971 1727096429.33589: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 24971 1727096429.33642: in run() - task 0afff68d-5257-3482-6844-00000000002a 24971 1727096429.33675: variable 'ansible_search_path' from source: unknown 24971 1727096429.33679: variable 'ansible_search_path' from source: unknown 24971 1727096429.33708: calling self._execute() 24971 1727096429.33797: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096429.33802: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096429.33816: variable 'omit' from source: magic vars 24971 1727096429.34184: variable 'ansible_distribution_major_version' from source: facts 24971 1727096429.34196: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096429.34253: variable 'omit' from source: magic vars 24971 1727096429.34260: variable 'omit' from source: magic vars 24971 1727096429.34295: variable 'omit' from source: magic vars 24971 1727096429.34339: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24971 1727096429.34377: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24971 1727096429.34396: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24971 1727096429.34412: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096429.34474: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096429.34561: variable 
'inventory_hostname' from source: host vars for 'managed_node3' 24971 1727096429.34564: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096429.34566: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096429.34974: Set connection var ansible_shell_type to sh 24971 1727096429.34978: Set connection var ansible_shell_executable to /bin/sh 24971 1727096429.34980: Set connection var ansible_timeout to 10 24971 1727096429.34982: Set connection var ansible_connection to ssh 24971 1727096429.34984: Set connection var ansible_pipelining to False 24971 1727096429.34986: Set connection var ansible_module_compression to ZIP_DEFLATED 24971 1727096429.34988: variable 'ansible_shell_executable' from source: unknown 24971 1727096429.34990: variable 'ansible_connection' from source: unknown 24971 1727096429.34992: variable 'ansible_module_compression' from source: unknown 24971 1727096429.34994: variable 'ansible_shell_type' from source: unknown 24971 1727096429.34996: variable 'ansible_shell_executable' from source: unknown 24971 1727096429.34998: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096429.35000: variable 'ansible_pipelining' from source: unknown 24971 1727096429.35002: variable 'ansible_timeout' from source: unknown 24971 1727096429.35081: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096429.35329: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24971 1727096429.35338: variable 'omit' from source: magic vars 24971 1727096429.35341: starting attempt loop 24971 1727096429.35344: running the handler 24971 1727096429.35475: variable '__network_connections_result' from source: set_fact 24971 1727096429.35500: variable '__network_connections_result' from source: set_fact 24971 1727096429.35915: handler run complete 24971 1727096429.35944: attempt loop complete, returning result 24971 1727096429.36017: _execute() done 24971 1727096429.36020: dumping result to json 24971 1727096429.36023: done dumping result, returning 24971 1727096429.36032: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0afff68d-5257-3482-6844-00000000002a] 24971 1727096429.36037: sending task result for task 0afff68d-5257-3482-6844-00000000002a 24971 1727096429.36243: done sending task result for task 0afff68d-5257-3482-6844-00000000002a 24971 1727096429.36249: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "address": [ "2001:db8::2/32", "2001:db8::3/32", "2001:db8::4/32" ], "auto6": false, "dhcp4": false, "gateway6": "2001:db8::1" }, "name": "veth0", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[003] #0, state:up persistent_state:present, 'veth0': add connection veth0, 02b26351-6d86-4c58-9ebc-fea256f8cb97\n[004] #0, state:up persistent_state:present, 'veth0': up connection veth0, 02b26351-6d86-4c58-9ebc-fea256f8cb97 (not-active)\n", "stderr_lines": [ "[003] #0, state:up 
persistent_state:present, 'veth0': add connection veth0, 02b26351-6d86-4c58-9ebc-fea256f8cb97", "[004] #0, state:up persistent_state:present, 'veth0': up connection veth0, 02b26351-6d86-4c58-9ebc-fea256f8cb97 (not-active)" ] } } 24971 1727096429.36341: no more pending results, returning what we have 24971 1727096429.36345: results queue empty 24971 1727096429.36346: checking for any_errors_fatal 24971 1727096429.36353: done checking for any_errors_fatal 24971 1727096429.36354: checking for max_fail_percentage 24971 1727096429.36356: done checking for max_fail_percentage 24971 1727096429.36357: checking to see if all hosts have failed and the running result is not ok 24971 1727096429.36358: done checking to see if all hosts have failed 24971 1727096429.36359: getting the remaining hosts for this loop 24971 1727096429.36360: done getting the remaining hosts for this loop 24971 1727096429.36364: getting the next task for host managed_node3 24971 1727096429.36373: done getting next task for host managed_node3 24971 1727096429.36377: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 24971 1727096429.36380: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24971 1727096429.36392: getting variables 24971 1727096429.36394: in VariableManager get_vars() 24971 1727096429.36434: Calling all_inventory to load vars for managed_node3 24971 1727096429.36443: Calling groups_inventory to load vars for managed_node3 24971 1727096429.36446: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096429.36459: Calling all_plugins_play to load vars for managed_node3 24971 1727096429.36462: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096429.36465: Calling groups_plugins_play to load vars for managed_node3 24971 1727096429.37983: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096429.39557: done with get_vars() 24971 1727096429.39579: done getting variables 24971 1727096429.39639: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Monday 23 September 2024 09:00:29 -0400 (0:00:00.067) 0:00:16.874 ****** 24971 1727096429.39671: entering _queue_task() for managed_node3/debug 24971 1727096429.39942: worker is 1 (out of 1 available) 24971 1727096429.39955: exiting _queue_task() for managed_node3/debug 24971 1727096429.39969: done queuing things up, now waiting for results queue to drain 24971 1727096429.39970: waiting for pending results... 
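The '__network_connections_result' variable printed by the two debug tasks above (tasks/main.yml:177 and :181) was registered by the "Configure networking connection profiles" task. A plausible, simplified reconstruction of those three tasks follows; the 'register:' keyword is inferred from the variable name, and internal arguments such as '__header' are omitted:

    # Inferred from the log output; simplified, not the role's actual source.
    - name: Configure networking connection profiles
      fedora.linux_system_roles.network_connections:
        provider: "{{ network_provider }}"
        connections: "{{ network_connections }}"
      register: __network_connections_result

    - name: Show stderr messages for the network_connections
      ansible.builtin.debug:
        var: __network_connections_result.stderr_lines

    - name: Show debug messages for the network_connections
      ansible.builtin.debug:
        var: __network_connections_result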
24971 1727096429.40287: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 24971 1727096429.40361: in run() - task 0afff68d-5257-3482-6844-00000000002b 24971 1727096429.40385: variable 'ansible_search_path' from source: unknown 24971 1727096429.40394: variable 'ansible_search_path' from source: unknown 24971 1727096429.40432: calling self._execute() 24971 1727096429.40520: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096429.40576: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096429.40581: variable 'omit' from source: magic vars 24971 1727096429.40897: variable 'ansible_distribution_major_version' from source: facts 24971 1727096429.40914: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096429.41038: variable 'network_state' from source: role '' defaults 24971 1727096429.41053: Evaluated conditional (network_state != {}): False 24971 1727096429.41060: when evaluation is False, skipping this task 24971 1727096429.41071: _execute() done 24971 1727096429.41078: dumping result to json 24971 1727096429.41175: done dumping result, returning 24971 1727096429.41178: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0afff68d-5257-3482-6844-00000000002b] 24971 1727096429.41181: sending task result for task 0afff68d-5257-3482-6844-00000000002b 24971 1727096429.41242: done sending task result for task 0afff68d-5257-3482-6844-00000000002b 24971 1727096429.41245: WORKER PROCESS EXITING skipping: [managed_node3] => { "false_condition": "network_state != {}" } 24971 1727096429.41293: no more pending results, returning what we have 24971 1727096429.41296: results queue empty 24971 1727096429.41297: checking for any_errors_fatal 24971 1727096429.41308: done checking for any_errors_fatal 24971 1727096429.41309: checking for max_fail_percentage 24971 1727096429.41311: done checking for max_fail_percentage 24971 1727096429.41312: checking to see if all hosts have failed and the running result is not ok 24971 1727096429.41312: done checking to see if all hosts have failed 24971 1727096429.41313: getting the remaining hosts for this loop 24971 1727096429.41315: done getting the remaining hosts for this loop 24971 1727096429.41318: getting the next task for host managed_node3 24971 1727096429.41323: done getting next task for host managed_node3 24971 1727096429.41326: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 24971 1727096429.41329: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24971 1727096429.41342: getting variables 24971 1727096429.41343: in VariableManager get_vars() 24971 1727096429.41377: Calling all_inventory to load vars for managed_node3 24971 1727096429.41379: Calling groups_inventory to load vars for managed_node3 24971 1727096429.41381: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096429.41389: Calling all_plugins_play to load vars for managed_node3 24971 1727096429.41392: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096429.41394: Calling groups_plugins_play to load vars for managed_node3 24971 1727096429.42790: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096429.45034: done with get_vars() 24971 1727096429.45058: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Monday 23 September 2024 09:00:29 -0400 (0:00:00.055) 0:00:16.930 ****** 24971 1727096429.45248: entering _queue_task() for managed_node3/ping 24971 1727096429.45250: Creating lock for ping 24971 1727096429.45845: worker is 1 (out of 1 available) 24971 1727096429.45859: exiting _queue_task() for managed_node3/ping 24971 1727096429.45874: done queuing things up, now waiting for results queue to drain 24971 1727096429.45875: waiting for pending results... 24971 1727096429.46386: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 24971 1727096429.46476: in run() - task 0afff68d-5257-3482-6844-00000000002c 24971 1727096429.46901: variable 'ansible_search_path' from source: unknown 24971 1727096429.46905: variable 'ansible_search_path' from source: unknown 24971 1727096429.46974: calling self._execute() 24971 1727096429.47161: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096429.47165: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096429.47181: variable 'omit' from source: magic vars 24971 1727096429.47919: variable 'ansible_distribution_major_version' from source: facts 24971 1727096429.47930: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096429.47971: variable 'omit' from source: magic vars 24971 1727096429.48102: variable 'omit' from source: magic vars 24971 1727096429.48132: variable 'omit' from source: magic vars 24971 1727096429.48167: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24971 1727096429.48402: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24971 1727096429.48405: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24971 1727096429.48408: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096429.48410: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096429.48413: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24971 1727096429.48415: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096429.48481: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096429.48584: Set connection var ansible_shell_type to sh 24971 
1727096429.48592: Set connection var ansible_shell_executable to /bin/sh 24971 1727096429.48602: Set connection var ansible_timeout to 10 24971 1727096429.48607: Set connection var ansible_connection to ssh 24971 1727096429.48615: Set connection var ansible_pipelining to False 24971 1727096429.48736: Set connection var ansible_module_compression to ZIP_DEFLATED 24971 1727096429.48838: variable 'ansible_shell_executable' from source: unknown 24971 1727096429.48841: variable 'ansible_connection' from source: unknown 24971 1727096429.48844: variable 'ansible_module_compression' from source: unknown 24971 1727096429.48847: variable 'ansible_shell_type' from source: unknown 24971 1727096429.48849: variable 'ansible_shell_executable' from source: unknown 24971 1727096429.48852: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096429.48855: variable 'ansible_pipelining' from source: unknown 24971 1727096429.48857: variable 'ansible_timeout' from source: unknown 24971 1727096429.48859: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096429.49308: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 24971 1727096429.49317: variable 'omit' from source: magic vars 24971 1727096429.49320: starting attempt loop 24971 1727096429.49322: running the handler 24971 1727096429.49373: _low_level_execute_command(): starting 24971 1727096429.49381: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24971 1727096429.50281: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24971 1727096429.50293: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096429.50305: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096429.50362: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24971 1727096429.50365: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 24971 1727096429.50369: stderr chunk (state=3): >>>debug2: match not found <<< 24971 1727096429.50374: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096429.50439: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096429.50462: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096429.50533: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096429.52204: stdout chunk (state=3): >>>/root <<< 24971 1727096429.52354: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 
1727096429.52357: stdout chunk (state=3): >>><<< 24971 1727096429.52360: stderr chunk (state=3): >>><<< 24971 1727096429.52381: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096429.52398: _low_level_execute_command(): starting 24971 1727096429.52476: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096429.5238767-25699-132126584984064 `" && echo ansible-tmp-1727096429.5238767-25699-132126584984064="` echo /root/.ansible/tmp/ansible-tmp-1727096429.5238767-25699-132126584984064 `" ) && sleep 0' 24971 1727096429.53019: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24971 1727096429.53033: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096429.53047: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096429.53137: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096429.53179: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096429.53196: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096429.53214: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096429.53294: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096429.55185: stdout chunk (state=3): >>>ansible-tmp-1727096429.5238767-25699-132126584984064=/root/.ansible/tmp/ansible-tmp-1727096429.5238767-25699-132126584984064 <<< 
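The directory echoed back above is the per-invocation ansible-tmp-* working directory under ~/.ansible/tmp, created by the umask/mkdir command shown just before it. If a different location is wanted, the shell plugin's remote_tmp option can be set per host; a hypothetical host var (path chosen purely for illustration):

    ansible_remote_tmp: /var/tmp/.ansible/tmp   # hypothetical override of the default ~/.ansible/tmp seen in this run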
24971 1727096429.55343: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096429.55347: stdout chunk (state=3): >>><<< 24971 1727096429.55349: stderr chunk (state=3): >>><<< 24971 1727096429.55370: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096429.5238767-25699-132126584984064=/root/.ansible/tmp/ansible-tmp-1727096429.5238767-25699-132126584984064 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096429.55573: variable 'ansible_module_compression' from source: unknown 24971 1727096429.55576: ANSIBALLZ: Using lock for ping 24971 1727096429.55579: ANSIBALLZ: Acquiring lock 24971 1727096429.55581: ANSIBALLZ: Lock acquired: 139839575495248 24971 1727096429.55583: ANSIBALLZ: Creating module 24971 1727096429.73407: ANSIBALLZ: Writing module into payload 24971 1727096429.73483: ANSIBALLZ: Writing module 24971 1727096429.73511: ANSIBALLZ: Renaming module 24971 1727096429.73523: ANSIBALLZ: Done creating module 24971 1727096429.73544: variable 'ansible_facts' from source: unknown 24971 1727096429.73627: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096429.5238767-25699-132126584984064/AnsiballZ_ping.py 24971 1727096429.73819: Sending initial data 24971 1727096429.73823: Sent initial data (153 bytes) 24971 1727096429.74546: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24971 1727096429.74678: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 
24971 1727096429.74699: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096429.74887: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096429.76428: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24971 1727096429.76486: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 24971 1727096429.76523: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096429.5238767-25699-132126584984064/AnsiballZ_ping.py" <<< 24971 1727096429.76716: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24971ctsu8jan/tmpddskrdme /root/.ansible/tmp/ansible-tmp-1727096429.5238767-25699-132126584984064/AnsiballZ_ping.py <<< 24971 1727096429.76721: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24971ctsu8jan/tmpddskrdme" to remote "/root/.ansible/tmp/ansible-tmp-1727096429.5238767-25699-132126584984064/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096429.5238767-25699-132126584984064/AnsiballZ_ping.py" <<< 24971 1727096429.77877: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096429.77881: stdout chunk (state=3): >>><<< 24971 1727096429.77883: stderr chunk (state=3): >>><<< 24971 1727096429.78063: done transferring module to remote 24971 1727096429.78067: _low_level_execute_command(): starting 24971 1727096429.78076: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096429.5238767-25699-132126584984064/ /root/.ansible/tmp/ansible-tmp-1727096429.5238767-25699-132126584984064/AnsiballZ_ping.py && sleep 0' 24971 1727096429.79137: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24971 1727096429.79151: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096429.79284: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096429.79347: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096429.79492: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096429.79521: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096429.81404: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096429.81423: stdout chunk (state=3): >>><<< 24971 1727096429.81436: stderr chunk (state=3): >>><<< 24971 1727096429.81511: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096429.81620: _low_level_execute_command(): starting 24971 1727096429.81626: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096429.5238767-25699-132126584984064/AnsiballZ_ping.py && sleep 0' 24971 1727096429.82582: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096429.82586: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 24971 1727096429.82589: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096429.82838: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096429.82862: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096429.82865: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096429.83065: stderr chunk (state=3): 
>>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096429.83108: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096429.98139: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 24971 1727096429.99406: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. <<< 24971 1727096429.99429: stderr chunk (state=3): >>><<< 24971 1727096429.99432: stdout chunk (state=3): >>><<< 24971 1727096429.99449: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
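The stdout captured above, {"ping": "pong", ...}, is the entire payload of the ping module; the action plugin turns it into the ok result reported a little further down. A minimal sketch of the "Re-test connectivity" task as this run implies it (module ping with default arguments; the exact body at roles/network/tasks/main.yml:192 is an assumption):

    - name: Re-test connectivity
      ping:    # no arguments; the module defaults data to "pong", as the invocation below shows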
24971 1727096429.99478: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096429.5238767-25699-132126584984064/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24971 1727096429.99487: _low_level_execute_command(): starting 24971 1727096429.99491: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096429.5238767-25699-132126584984064/ > /dev/null 2>&1 && sleep 0' 24971 1727096429.99924: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096429.99928: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096429.99931: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096429.99933: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096429.99990: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096429.99994: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096429.99997: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096430.00022: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096430.01810: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096430.01835: stderr chunk (state=3): >>><<< 24971 1727096430.01838: stdout chunk (state=3): >>><<< 24971 1727096430.01856: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096430.01863: handler run complete 24971 1727096430.01879: attempt loop complete, returning result 24971 1727096430.01882: _execute() done 24971 1727096430.01885: dumping result to json 24971 1727096430.01887: done dumping result, returning 24971 1727096430.01895: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [0afff68d-5257-3482-6844-00000000002c] 24971 1727096430.01899: sending task result for task 0afff68d-5257-3482-6844-00000000002c 24971 1727096430.01985: done sending task result for task 0afff68d-5257-3482-6844-00000000002c 24971 1727096430.01988: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "ping": "pong" } 24971 1727096430.02043: no more pending results, returning what we have 24971 1727096430.02046: results queue empty 24971 1727096430.02047: checking for any_errors_fatal 24971 1727096430.02054: done checking for any_errors_fatal 24971 1727096430.02054: checking for max_fail_percentage 24971 1727096430.02056: done checking for max_fail_percentage 24971 1727096430.02057: checking to see if all hosts have failed and the running result is not ok 24971 1727096430.02058: done checking to see if all hosts have failed 24971 1727096430.02058: getting the remaining hosts for this loop 24971 1727096430.02060: done getting the remaining hosts for this loop 24971 1727096430.02064: getting the next task for host managed_node3 24971 1727096430.02076: done getting next task for host managed_node3 24971 1727096430.02078: ^ task is: TASK: meta (role_complete) 24971 1727096430.02081: ^ state is: HOST STATE: block=3, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24971 1727096430.02091: getting variables 24971 1727096430.02093: in VariableManager get_vars() 24971 1727096430.02136: Calling all_inventory to load vars for managed_node3 24971 1727096430.02138: Calling groups_inventory to load vars for managed_node3 24971 1727096430.02140: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096430.02150: Calling all_plugins_play to load vars for managed_node3 24971 1727096430.02152: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096430.02155: Calling groups_plugins_play to load vars for managed_node3 24971 1727096430.02979: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096430.03836: done with get_vars() 24971 1727096430.03854: done getting variables 24971 1727096430.03915: done queuing things up, now waiting for results queue to drain 24971 1727096430.03917: results queue empty 24971 1727096430.03917: checking for any_errors_fatal 24971 1727096430.03919: done checking for any_errors_fatal 24971 1727096430.03919: checking for max_fail_percentage 24971 1727096430.03920: done checking for max_fail_percentage 24971 1727096430.03920: checking to see if all hosts have failed and the running result is not ok 24971 1727096430.03921: done checking to see if all hosts have failed 24971 1727096430.03921: getting the remaining hosts for this loop 24971 1727096430.03922: done getting the remaining hosts for this loop 24971 1727096430.03924: getting the next task for host managed_node3 24971 1727096430.03926: done getting next task for host managed_node3 24971 1727096430.03928: ^ task is: TASK: Include the task 'assert_device_present.yml' 24971 1727096430.03929: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24971 1727096430.03930: getting variables 24971 1727096430.03931: in VariableManager get_vars() 24971 1727096430.03940: Calling all_inventory to load vars for managed_node3 24971 1727096430.03942: Calling groups_inventory to load vars for managed_node3 24971 1727096430.03943: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096430.03946: Calling all_plugins_play to load vars for managed_node3 24971 1727096430.03948: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096430.03949: Calling groups_plugins_play to load vars for managed_node3 24971 1727096430.04640: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096430.05479: done with get_vars() 24971 1727096430.05494: done getting variables TASK [Include the task 'assert_device_present.yml'] **************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:47 Monday 23 September 2024 09:00:30 -0400 (0:00:00.602) 0:00:17.533 ****** 24971 1727096430.05545: entering _queue_task() for managed_node3/include_tasks 24971 1727096430.05819: worker is 1 (out of 1 available) 24971 1727096430.05838: exiting _queue_task() for managed_node3/include_tasks 24971 1727096430.05849: done queuing things up, now waiting for results queue to drain 24971 1727096430.05850: waiting for pending results... 
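The include being queued sits at tests_ipv6.yml:47 and, as the next entries show, resolves to playbooks/tasks/assert_device_present.yml. A sketch of how the playbook presumably expresses it (the relative path form is an assumption):

    - name: Include the task 'assert_device_present.yml'
      include_tasks: tasks/assert_device_present.yml   # assumed to be relative to the playbook directory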
24971 1727096430.06022: running TaskExecutor() for managed_node3/TASK: Include the task 'assert_device_present.yml' 24971 1727096430.06091: in run() - task 0afff68d-5257-3482-6844-00000000005c 24971 1727096430.06102: variable 'ansible_search_path' from source: unknown 24971 1727096430.06132: calling self._execute() 24971 1727096430.06205: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096430.06209: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096430.06218: variable 'omit' from source: magic vars 24971 1727096430.06506: variable 'ansible_distribution_major_version' from source: facts 24971 1727096430.06519: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096430.06522: _execute() done 24971 1727096430.06525: dumping result to json 24971 1727096430.06528: done dumping result, returning 24971 1727096430.06533: done running TaskExecutor() for managed_node3/TASK: Include the task 'assert_device_present.yml' [0afff68d-5257-3482-6844-00000000005c] 24971 1727096430.06537: sending task result for task 0afff68d-5257-3482-6844-00000000005c 24971 1727096430.06650: no more pending results, returning what we have 24971 1727096430.06655: in VariableManager get_vars() 24971 1727096430.06698: Calling all_inventory to load vars for managed_node3 24971 1727096430.06701: Calling groups_inventory to load vars for managed_node3 24971 1727096430.06703: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096430.06714: Calling all_plugins_play to load vars for managed_node3 24971 1727096430.06717: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096430.06720: Calling groups_plugins_play to load vars for managed_node3 24971 1727096430.07239: done sending task result for task 0afff68d-5257-3482-6844-00000000005c 24971 1727096430.07243: WORKER PROCESS EXITING 24971 1727096430.07508: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096430.08423: done with get_vars() 24971 1727096430.08435: variable 'ansible_search_path' from source: unknown 24971 1727096430.08446: we have included files to process 24971 1727096430.08446: generating all_blocks data 24971 1727096430.08447: done generating all_blocks data 24971 1727096430.08452: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 24971 1727096430.08453: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 24971 1727096430.08454: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 24971 1727096430.08561: in VariableManager get_vars() 24971 1727096430.08581: done with get_vars() 24971 1727096430.08653: done processing included file 24971 1727096430.08654: iterating over new_blocks loaded from include file 24971 1727096430.08655: in VariableManager get_vars() 24971 1727096430.08666: done with get_vars() 24971 1727096430.08670: filtering new block on tags 24971 1727096430.08685: done filtering new block on tags 24971 1727096430.08687: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed_node3 24971 1727096430.08690: extending task lists for 
all hosts with included blocks 24971 1727096430.09864: done extending task lists 24971 1727096430.09865: done processing included files 24971 1727096430.09866: results queue empty 24971 1727096430.09866: checking for any_errors_fatal 24971 1727096430.09871: done checking for any_errors_fatal 24971 1727096430.09871: checking for max_fail_percentage 24971 1727096430.09872: done checking for max_fail_percentage 24971 1727096430.09872: checking to see if all hosts have failed and the running result is not ok 24971 1727096430.09873: done checking to see if all hosts have failed 24971 1727096430.09873: getting the remaining hosts for this loop 24971 1727096430.09874: done getting the remaining hosts for this loop 24971 1727096430.09876: getting the next task for host managed_node3 24971 1727096430.09880: done getting next task for host managed_node3 24971 1727096430.09882: ^ task is: TASK: Include the task 'get_interface_stat.yml' 24971 1727096430.09883: ^ state is: HOST STATE: block=3, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24971 1727096430.09885: getting variables 24971 1727096430.09886: in VariableManager get_vars() 24971 1727096430.09895: Calling all_inventory to load vars for managed_node3 24971 1727096430.09896: Calling groups_inventory to load vars for managed_node3 24971 1727096430.09897: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096430.09901: Calling all_plugins_play to load vars for managed_node3 24971 1727096430.09903: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096430.09904: Calling groups_plugins_play to load vars for managed_node3 24971 1727096430.10516: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096430.11344: done with get_vars() 24971 1727096430.11358: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Monday 23 September 2024 09:00:30 -0400 (0:00:00.058) 0:00:17.591 ****** 24971 1727096430.11414: entering _queue_task() for managed_node3/include_tasks 24971 1727096430.11653: worker is 1 (out of 1 available) 24971 1727096430.11665: exiting _queue_task() for managed_node3/include_tasks 24971 1727096430.11682: done queuing things up, now waiting for results queue to drain 24971 1727096430.11683: waiting for pending results... 
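assert_device_present.yml itself appears to be a thin wrapper: the entry at its line 3, queued here, pulls in get_interface_stat.yml, presumably followed by an assertion on the gathered stat. A sketch under that assumption (the interface_stat variable name and the assert wording are guesses, not read from this log):

    - name: Include the task 'get_interface_stat.yml'
      include_tasks: get_interface_stat.yml

    - name: Assert that the interface is present    # assumed follow-up task
      assert:
        that:
          - interface_stat.stat.exists              # assumed register name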
24971 1727096430.11851: running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' 24971 1727096430.11925: in run() - task 0afff68d-5257-3482-6844-0000000002b5 24971 1727096430.11936: variable 'ansible_search_path' from source: unknown 24971 1727096430.11939: variable 'ansible_search_path' from source: unknown 24971 1727096430.11972: calling self._execute() 24971 1727096430.12038: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096430.12042: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096430.12051: variable 'omit' from source: magic vars 24971 1727096430.12316: variable 'ansible_distribution_major_version' from source: facts 24971 1727096430.12326: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096430.12332: _execute() done 24971 1727096430.12335: dumping result to json 24971 1727096430.12339: done dumping result, returning 24971 1727096430.12342: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' [0afff68d-5257-3482-6844-0000000002b5] 24971 1727096430.12352: sending task result for task 0afff68d-5257-3482-6844-0000000002b5 24971 1727096430.12429: done sending task result for task 0afff68d-5257-3482-6844-0000000002b5 24971 1727096430.12432: WORKER PROCESS EXITING 24971 1727096430.12485: no more pending results, returning what we have 24971 1727096430.12490: in VariableManager get_vars() 24971 1727096430.12534: Calling all_inventory to load vars for managed_node3 24971 1727096430.12536: Calling groups_inventory to load vars for managed_node3 24971 1727096430.12538: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096430.12548: Calling all_plugins_play to load vars for managed_node3 24971 1727096430.12550: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096430.12552: Calling groups_plugins_play to load vars for managed_node3 24971 1727096430.13389: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096430.14235: done with get_vars() 24971 1727096430.14249: variable 'ansible_search_path' from source: unknown 24971 1727096430.14250: variable 'ansible_search_path' from source: unknown 24971 1727096430.14281: we have included files to process 24971 1727096430.14282: generating all_blocks data 24971 1727096430.14283: done generating all_blocks data 24971 1727096430.14284: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 24971 1727096430.14285: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 24971 1727096430.14286: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 24971 1727096430.14439: done processing included file 24971 1727096430.14440: iterating over new_blocks loaded from include file 24971 1727096430.14441: in VariableManager get_vars() 24971 1727096430.14453: done with get_vars() 24971 1727096430.14454: filtering new block on tags 24971 1727096430.14463: done filtering new block on tags 24971 1727096430.14464: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node3 24971 
1727096430.14472: extending task lists for all hosts with included blocks 24971 1727096430.14532: done extending task lists 24971 1727096430.14533: done processing included files 24971 1727096430.14533: results queue empty 24971 1727096430.14533: checking for any_errors_fatal 24971 1727096430.14535: done checking for any_errors_fatal 24971 1727096430.14536: checking for max_fail_percentage 24971 1727096430.14536: done checking for max_fail_percentage 24971 1727096430.14537: checking to see if all hosts have failed and the running result is not ok 24971 1727096430.14537: done checking to see if all hosts have failed 24971 1727096430.14538: getting the remaining hosts for this loop 24971 1727096430.14539: done getting the remaining hosts for this loop 24971 1727096430.14540: getting the next task for host managed_node3 24971 1727096430.14543: done getting next task for host managed_node3 24971 1727096430.14544: ^ task is: TASK: Get stat for interface {{ interface }} 24971 1727096430.14546: ^ state is: HOST STATE: block=3, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24971 1727096430.14547: getting variables 24971 1727096430.14548: in VariableManager get_vars() 24971 1727096430.14556: Calling all_inventory to load vars for managed_node3 24971 1727096430.14558: Calling groups_inventory to load vars for managed_node3 24971 1727096430.14559: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096430.14563: Calling all_plugins_play to load vars for managed_node3 24971 1727096430.14564: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096430.14566: Calling groups_plugins_play to load vars for managed_node3 24971 1727096430.15206: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096430.16035: done with get_vars() 24971 1727096430.16049: done getting variables 24971 1727096430.16162: variable 'interface' from source: play vars TASK [Get stat for interface veth0] ******************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Monday 23 September 2024 09:00:30 -0400 (0:00:00.047) 0:00:17.639 ****** 24971 1727096430.16188: entering _queue_task() for managed_node3/stat 24971 1727096430.16437: worker is 1 (out of 1 available) 24971 1727096430.16451: exiting _queue_task() for managed_node3/stat 24971 1727096430.16462: done queuing things up, now waiting for results queue to drain 24971 1727096430.16463: waiting for pending results... 
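The task name was templated from the interface play var, giving "Get stat for interface veth0", and the plugin being queued is stat. A minimal sketch of get_interface_stat.yml consistent with that (the /sys/class/net path and the register name are assumptions, not read from this log):

    - name: Get stat for interface {{ interface }}
      stat:
        path: "/sys/class/net/{{ interface }}"   # assumed; a present interface exposes this sysfs node
      register: interface_stat                   # assumed name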
24971 1727096430.16633: running TaskExecutor() for managed_node3/TASK: Get stat for interface veth0 24971 1727096430.16709: in run() - task 0afff68d-5257-3482-6844-0000000003a0 24971 1727096430.16720: variable 'ansible_search_path' from source: unknown 24971 1727096430.16724: variable 'ansible_search_path' from source: unknown 24971 1727096430.16751: calling self._execute() 24971 1727096430.16821: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096430.16824: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096430.16833: variable 'omit' from source: magic vars 24971 1727096430.17087: variable 'ansible_distribution_major_version' from source: facts 24971 1727096430.17097: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096430.17103: variable 'omit' from source: magic vars 24971 1727096430.17137: variable 'omit' from source: magic vars 24971 1727096430.17203: variable 'interface' from source: play vars 24971 1727096430.17217: variable 'omit' from source: magic vars 24971 1727096430.17250: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24971 1727096430.17278: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24971 1727096430.17294: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24971 1727096430.17306: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096430.17317: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096430.17341: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24971 1727096430.17345: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096430.17347: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096430.17414: Set connection var ansible_shell_type to sh 24971 1727096430.17421: Set connection var ansible_shell_executable to /bin/sh 24971 1727096430.17429: Set connection var ansible_timeout to 10 24971 1727096430.17434: Set connection var ansible_connection to ssh 24971 1727096430.17438: Set connection var ansible_pipelining to False 24971 1727096430.17443: Set connection var ansible_module_compression to ZIP_DEFLATED 24971 1727096430.17463: variable 'ansible_shell_executable' from source: unknown 24971 1727096430.17470: variable 'ansible_connection' from source: unknown 24971 1727096430.17474: variable 'ansible_module_compression' from source: unknown 24971 1727096430.17476: variable 'ansible_shell_type' from source: unknown 24971 1727096430.17479: variable 'ansible_shell_executable' from source: unknown 24971 1727096430.17481: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096430.17483: variable 'ansible_pipelining' from source: unknown 24971 1727096430.17485: variable 'ansible_timeout' from source: unknown 24971 1727096430.17487: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096430.17622: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 24971 1727096430.17630: variable 'omit' from 
source: magic vars 24971 1727096430.17634: starting attempt loop 24971 1727096430.17637: running the handler 24971 1727096430.17649: _low_level_execute_command(): starting 24971 1727096430.17656: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24971 1727096430.18149: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096430.18186: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 24971 1727096430.18190: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096430.18192: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096430.18194: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 24971 1727096430.18196: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096430.18249: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096430.18252: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096430.18256: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096430.18298: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096430.19947: stdout chunk (state=3): >>>/root <<< 24971 1727096430.20037: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096430.20065: stderr chunk (state=3): >>><<< 24971 1727096430.20072: stdout chunk (state=3): >>><<< 24971 1727096430.20093: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096430.20103: _low_level_execute_command(): starting 24971 
1727096430.20112: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096430.2009263-25730-253337885153941 `" && echo ansible-tmp-1727096430.2009263-25730-253337885153941="` echo /root/.ansible/tmp/ansible-tmp-1727096430.2009263-25730-253337885153941 `" ) && sleep 0' 24971 1727096430.20528: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096430.20561: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24971 1727096430.20565: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 24971 1727096430.20580: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 24971 1727096430.20583: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24971 1727096430.20585: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096430.20632: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096430.20659: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096430.20663: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096430.20670: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096430.22533: stdout chunk (state=3): >>>ansible-tmp-1727096430.2009263-25730-253337885153941=/root/.ansible/tmp/ansible-tmp-1727096430.2009263-25730-253337885153941 <<< 24971 1727096430.22640: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096430.22671: stderr chunk (state=3): >>><<< 24971 1727096430.22675: stdout chunk (state=3): >>><<< 24971 1727096430.22693: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096430.2009263-25730-253337885153941=/root/.ansible/tmp/ansible-tmp-1727096430.2009263-25730-253337885153941 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096430.22730: variable 'ansible_module_compression' from source: unknown 24971 1727096430.22774: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24971ctsu8jan/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 24971 1727096430.22802: variable 'ansible_facts' from source: unknown 24971 1727096430.22861: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096430.2009263-25730-253337885153941/AnsiballZ_stat.py 24971 1727096430.22958: Sending initial data 24971 1727096430.22962: Sent initial data (153 bytes) 24971 1727096430.23406: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096430.23409: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096430.23412: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address <<< 24971 1727096430.23414: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096430.23416: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096430.23465: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096430.23478: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096430.23508: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096430.25071: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24971 1727096430.25095: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 24971 1727096430.25128: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24971ctsu8jan/tmpqbhj6l5e /root/.ansible/tmp/ansible-tmp-1727096430.2009263-25730-253337885153941/AnsiballZ_stat.py <<< 24971 1727096430.25134: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096430.2009263-25730-253337885153941/AnsiballZ_stat.py" <<< 24971 1727096430.25165: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24971ctsu8jan/tmpqbhj6l5e" to remote "/root/.ansible/tmp/ansible-tmp-1727096430.2009263-25730-253337885153941/AnsiballZ_stat.py" <<< 24971 1727096430.25170: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096430.2009263-25730-253337885153941/AnsiballZ_stat.py" <<< 24971 1727096430.25653: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096430.25701: stderr chunk (state=3): >>><<< 24971 1727096430.25704: stdout chunk (state=3): >>><<< 24971 1727096430.25744: done transferring module to remote 24971 1727096430.25753: _low_level_execute_command(): starting 24971 1727096430.25758: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096430.2009263-25730-253337885153941/ /root/.ansible/tmp/ansible-tmp-1727096430.2009263-25730-253337885153941/AnsiballZ_stat.py && sleep 0' 24971 1727096430.26204: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096430.26207: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 24971 1727096430.26209: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096430.26212: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096430.26218: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096430.26266: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096430.26280: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096430.26304: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096430.28055: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096430.28084: stderr chunk (state=3): >>><<< 24971 1727096430.28087: stdout chunk (state=3): >>><<< 24971 1727096430.28102: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096430.28106: _low_level_execute_command(): starting 24971 1727096430.28109: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096430.2009263-25730-253337885153941/AnsiballZ_stat.py && sleep 0' 24971 1727096430.28546: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096430.28549: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 24971 1727096430.28552: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration <<< 24971 1727096430.28554: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 24971 1727096430.28556: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096430.28606: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096430.28615: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096430.28618: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096430.28652: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096430.43797: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/veth0", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 31664, "dev": 23, "nlink": 1, "atime": 1727096419.6950097, "mtime": 1727096419.6950097, "ctime": 1727096419.6950097, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": 
"/sys/devices/virtual/net/veth0", "lnk_target": "../../devices/virtual/net/veth0", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/veth0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 24971 1727096430.45030: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. <<< 24971 1727096430.45059: stderr chunk (state=3): >>><<< 24971 1727096430.45062: stdout chunk (state=3): >>><<< 24971 1727096430.45082: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/veth0", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 31664, "dev": 23, "nlink": 1, "atime": 1727096419.6950097, "mtime": 1727096419.6950097, "ctime": 1727096419.6950097, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/veth0", "lnk_target": "../../devices/virtual/net/veth0", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/veth0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
24971 1727096430.45120: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/veth0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096430.2009263-25730-253337885153941/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24971 1727096430.45128: _low_level_execute_command(): starting 24971 1727096430.45134: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096430.2009263-25730-253337885153941/ > /dev/null 2>&1 && sleep 0' 24971 1727096430.45572: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096430.45577: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24971 1727096430.45607: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 24971 1727096430.45611: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration <<< 24971 1727096430.45613: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096430.45615: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096430.45683: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096430.45688: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096430.45692: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096430.45714: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096430.47503: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096430.47526: stderr chunk (state=3): >>><<< 24971 1727096430.47529: stdout chunk (state=3): >>><<< 24971 1727096430.47540: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096430.47550: handler run complete 24971 1727096430.47585: attempt loop complete, returning result 24971 1727096430.47588: _execute() done 24971 1727096430.47590: dumping result to json 24971 1727096430.47594: done dumping result, returning 24971 1727096430.47602: done running TaskExecutor() for managed_node3/TASK: Get stat for interface veth0 [0afff68d-5257-3482-6844-0000000003a0] 24971 1727096430.47606: sending task result for task 0afff68d-5257-3482-6844-0000000003a0 24971 1727096430.47704: done sending task result for task 0afff68d-5257-3482-6844-0000000003a0 24971 1727096430.47707: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "atime": 1727096419.6950097, "block_size": 4096, "blocks": 0, "ctime": 1727096419.6950097, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 31664, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/veth0", "lnk_target": "../../devices/virtual/net/veth0", "mode": "0777", "mtime": 1727096419.6950097, "nlink": 1, "path": "/sys/class/net/veth0", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 24971 1727096430.47794: no more pending results, returning what we have 24971 1727096430.47798: results queue empty 24971 1727096430.47799: checking for any_errors_fatal 24971 1727096430.47801: done checking for any_errors_fatal 24971 1727096430.47802: checking for max_fail_percentage 24971 1727096430.47804: done checking for max_fail_percentage 24971 1727096430.47805: checking to see if all hosts have failed and the running result is not ok 24971 1727096430.47805: done checking to see if all hosts have failed 24971 1727096430.47806: getting the remaining hosts for this loop 24971 1727096430.47807: done getting the remaining hosts for this loop 24971 1727096430.47811: getting the next task for host managed_node3 24971 1727096430.47825: done getting next task for host managed_node3 24971 1727096430.47827: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 24971 1727096430.47830: ^ state is: HOST STATE: block=3, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 24971 1727096430.47835: getting variables 24971 1727096430.47836: in VariableManager get_vars() 24971 1727096430.47878: Calling all_inventory to load vars for managed_node3 24971 1727096430.47880: Calling groups_inventory to load vars for managed_node3 24971 1727096430.47882: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096430.47893: Calling all_plugins_play to load vars for managed_node3 24971 1727096430.47895: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096430.47898: Calling groups_plugins_play to load vars for managed_node3 24971 1727096430.52544: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096430.54064: done with get_vars() 24971 1727096430.54092: done getting variables 24971 1727096430.54159: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) 24971 1727096430.54230: variable 'interface' from source: play vars TASK [Assert that the interface is present - 'veth0'] ************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Monday 23 September 2024 09:00:30 -0400 (0:00:00.380) 0:00:18.020 ****** 24971 1727096430.54253: entering _queue_task() for managed_node3/assert 24971 1727096430.54254: Creating lock for assert 24971 1727096430.54509: worker is 1 (out of 1 available) 24971 1727096430.54521: exiting _queue_task() for managed_node3/assert 24971 1727096430.54532: done queuing things up, now waiting for results queue to drain 24971 1727096430.54533: waiting for pending results... 
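The assert being queued here (assert_device_present.yml:5) reduces to the conditional evaluated a few records below, interface_stat.stat.exists. A minimal sketch of that task, leaving out any msg text or extra checks the real file may carry:

    - name: Assert that the interface is present - '{{ interface }}'
      ansible.builtin.assert:
        that:
          - interface_stat.stat.exists   # true here, since the stat above found /sys/class/net/veth0
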
24971 1727096430.54718: running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'veth0' 24971 1727096430.54787: in run() - task 0afff68d-5257-3482-6844-0000000002b6 24971 1727096430.54798: variable 'ansible_search_path' from source: unknown 24971 1727096430.54803: variable 'ansible_search_path' from source: unknown 24971 1727096430.54832: calling self._execute() 24971 1727096430.54908: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096430.54912: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096430.54921: variable 'omit' from source: magic vars 24971 1727096430.55192: variable 'ansible_distribution_major_version' from source: facts 24971 1727096430.55200: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096430.55211: variable 'omit' from source: magic vars 24971 1727096430.55237: variable 'omit' from source: magic vars 24971 1727096430.55310: variable 'interface' from source: play vars 24971 1727096430.55326: variable 'omit' from source: magic vars 24971 1727096430.55356: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24971 1727096430.55384: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24971 1727096430.55399: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24971 1727096430.55414: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096430.55426: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096430.55449: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24971 1727096430.55452: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096430.55455: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096430.55522: Set connection var ansible_shell_type to sh 24971 1727096430.55533: Set connection var ansible_shell_executable to /bin/sh 24971 1727096430.55539: Set connection var ansible_timeout to 10 24971 1727096430.55544: Set connection var ansible_connection to ssh 24971 1727096430.55549: Set connection var ansible_pipelining to False 24971 1727096430.55554: Set connection var ansible_module_compression to ZIP_DEFLATED 24971 1727096430.55627: variable 'ansible_shell_executable' from source: unknown 24971 1727096430.55631: variable 'ansible_connection' from source: unknown 24971 1727096430.55635: variable 'ansible_module_compression' from source: unknown 24971 1727096430.55638: variable 'ansible_shell_type' from source: unknown 24971 1727096430.55640: variable 'ansible_shell_executable' from source: unknown 24971 1727096430.55642: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096430.55644: variable 'ansible_pipelining' from source: unknown 24971 1727096430.55647: variable 'ansible_timeout' from source: unknown 24971 1727096430.55649: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096430.55691: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, 
class_only=False) 24971 1727096430.55699: variable 'omit' from source: magic vars 24971 1727096430.55703: starting attempt loop 24971 1727096430.55706: running the handler 24971 1727096430.55797: variable 'interface_stat' from source: set_fact 24971 1727096430.55811: Evaluated conditional (interface_stat.stat.exists): True 24971 1727096430.55817: handler run complete 24971 1727096430.55827: attempt loop complete, returning result 24971 1727096430.55830: _execute() done 24971 1727096430.55832: dumping result to json 24971 1727096430.55834: done dumping result, returning 24971 1727096430.55842: done running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'veth0' [0afff68d-5257-3482-6844-0000000002b6] 24971 1727096430.55846: sending task result for task 0afff68d-5257-3482-6844-0000000002b6 24971 1727096430.55921: done sending task result for task 0afff68d-5257-3482-6844-0000000002b6 24971 1727096430.55924: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 24971 1727096430.56003: no more pending results, returning what we have 24971 1727096430.56006: results queue empty 24971 1727096430.56007: checking for any_errors_fatal 24971 1727096430.56016: done checking for any_errors_fatal 24971 1727096430.56017: checking for max_fail_percentage 24971 1727096430.56018: done checking for max_fail_percentage 24971 1727096430.56019: checking to see if all hosts have failed and the running result is not ok 24971 1727096430.56020: done checking to see if all hosts have failed 24971 1727096430.56021: getting the remaining hosts for this loop 24971 1727096430.56022: done getting the remaining hosts for this loop 24971 1727096430.56025: getting the next task for host managed_node3 24971 1727096430.56032: done getting next task for host managed_node3 24971 1727096430.56037: ^ task is: TASK: Include the task 'assert_profile_present.yml' 24971 1727096430.56039: ^ state is: HOST STATE: block=3, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24971 1727096430.56043: getting variables 24971 1727096430.56044: in VariableManager get_vars() 24971 1727096430.56080: Calling all_inventory to load vars for managed_node3 24971 1727096430.56082: Calling groups_inventory to load vars for managed_node3 24971 1727096430.56084: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096430.56092: Calling all_plugins_play to load vars for managed_node3 24971 1727096430.56095: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096430.56097: Calling groups_plugins_play to load vars for managed_node3 24971 1727096430.56822: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096430.57759: done with get_vars() 24971 1727096430.57777: done getting variables TASK [Include the task 'assert_profile_present.yml'] *************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:49 Monday 23 September 2024 09:00:30 -0400 (0:00:00.035) 0:00:18.056 ****** 24971 1727096430.57836: entering _queue_task() for managed_node3/include_tasks 24971 1727096430.58031: worker is 1 (out of 1 available) 24971 1727096430.58045: exiting _queue_task() for managed_node3/include_tasks 24971 1727096430.58058: done queuing things up, now waiting for results queue to drain 24971 1727096430.58059: waiting for pending results... 24971 1727096430.58215: running TaskExecutor() for managed_node3/TASK: Include the task 'assert_profile_present.yml' 24971 1727096430.58276: in run() - task 0afff68d-5257-3482-6844-00000000005d 24971 1727096430.58289: variable 'ansible_search_path' from source: unknown 24971 1727096430.58316: calling self._execute() 24971 1727096430.58380: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096430.58384: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096430.58396: variable 'omit' from source: magic vars 24971 1727096430.58640: variable 'ansible_distribution_major_version' from source: facts 24971 1727096430.58650: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096430.58656: _execute() done 24971 1727096430.58659: dumping result to json 24971 1727096430.58662: done dumping result, returning 24971 1727096430.58672: done running TaskExecutor() for managed_node3/TASK: Include the task 'assert_profile_present.yml' [0afff68d-5257-3482-6844-00000000005d] 24971 1727096430.58675: sending task result for task 0afff68d-5257-3482-6844-00000000005d 24971 1727096430.58757: done sending task result for task 0afff68d-5257-3482-6844-00000000005d 24971 1727096430.58760: WORKER PROCESS EXITING 24971 1727096430.58791: no more pending results, returning what we have 24971 1727096430.58796: in VariableManager get_vars() 24971 1727096430.58833: Calling all_inventory to load vars for managed_node3 24971 1727096430.58835: Calling groups_inventory to load vars for managed_node3 24971 1727096430.58837: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096430.58846: Calling all_plugins_play to load vars for managed_node3 24971 1727096430.58848: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096430.58851: Calling groups_plugins_play to load vars for managed_node3 24971 1727096430.59589: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096430.60440: done with get_vars() 24971 
1727096430.60452: variable 'ansible_search_path' from source: unknown 24971 1727096430.60462: we have included files to process 24971 1727096430.60462: generating all_blocks data 24971 1727096430.60464: done generating all_blocks data 24971 1727096430.60470: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 24971 1727096430.60471: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 24971 1727096430.60473: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 24971 1727096430.60596: in VariableManager get_vars() 24971 1727096430.60613: done with get_vars() 24971 1727096430.60779: done processing included file 24971 1727096430.60781: iterating over new_blocks loaded from include file 24971 1727096430.60782: in VariableManager get_vars() 24971 1727096430.60793: done with get_vars() 24971 1727096430.60794: filtering new block on tags 24971 1727096430.60806: done filtering new block on tags 24971 1727096430.60807: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node3 24971 1727096430.60810: extending task lists for all hosts with included blocks 24971 1727096430.62007: done extending task lists 24971 1727096430.62008: done processing included files 24971 1727096430.62009: results queue empty 24971 1727096430.62009: checking for any_errors_fatal 24971 1727096430.62011: done checking for any_errors_fatal 24971 1727096430.62011: checking for max_fail_percentage 24971 1727096430.62012: done checking for max_fail_percentage 24971 1727096430.62013: checking to see if all hosts have failed and the running result is not ok 24971 1727096430.62013: done checking to see if all hosts have failed 24971 1727096430.62013: getting the remaining hosts for this loop 24971 1727096430.62014: done getting the remaining hosts for this loop 24971 1727096430.62016: getting the next task for host managed_node3 24971 1727096430.62018: done getting next task for host managed_node3 24971 1727096430.62020: ^ task is: TASK: Include the task 'get_profile_stat.yml' 24971 1727096430.62022: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24971 1727096430.62025: getting variables 24971 1727096430.62025: in VariableManager get_vars() 24971 1727096430.62035: Calling all_inventory to load vars for managed_node3 24971 1727096430.62037: Calling groups_inventory to load vars for managed_node3 24971 1727096430.62038: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096430.62042: Calling all_plugins_play to load vars for managed_node3 24971 1727096430.62044: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096430.62045: Calling groups_plugins_play to load vars for managed_node3 24971 1727096430.62725: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096430.63638: done with get_vars() 24971 1727096430.63651: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Monday 23 September 2024 09:00:30 -0400 (0:00:00.058) 0:00:18.114 ****** 24971 1727096430.63705: entering _queue_task() for managed_node3/include_tasks 24971 1727096430.63934: worker is 1 (out of 1 available) 24971 1727096430.63946: exiting _queue_task() for managed_node3/include_tasks 24971 1727096430.63958: done queuing things up, now waiting for results queue to drain 24971 1727096430.63959: waiting for pending results... 24971 1727096430.64123: running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' 24971 1727096430.64196: in run() - task 0afff68d-5257-3482-6844-0000000003b8 24971 1727096430.64203: variable 'ansible_search_path' from source: unknown 24971 1727096430.64207: variable 'ansible_search_path' from source: unknown 24971 1727096430.64234: calling self._execute() 24971 1727096430.64309: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096430.64314: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096430.64322: variable 'omit' from source: magic vars 24971 1727096430.64584: variable 'ansible_distribution_major_version' from source: facts 24971 1727096430.64594: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096430.64601: _execute() done 24971 1727096430.64604: dumping result to json 24971 1727096430.64606: done dumping result, returning 24971 1727096430.64612: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' [0afff68d-5257-3482-6844-0000000003b8] 24971 1727096430.64616: sending task result for task 0afff68d-5257-3482-6844-0000000003b8 24971 1727096430.64698: done sending task result for task 0afff68d-5257-3482-6844-0000000003b8 24971 1727096430.64701: WORKER PROCESS EXITING 24971 1727096430.64753: no more pending results, returning what we have 24971 1727096430.64758: in VariableManager get_vars() 24971 1727096430.64803: Calling all_inventory to load vars for managed_node3 24971 1727096430.64806: Calling groups_inventory to load vars for managed_node3 24971 1727096430.64808: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096430.64817: Calling all_plugins_play to load vars for managed_node3 24971 1727096430.64820: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096430.64822: Calling groups_plugins_play to load vars for managed_node3 24971 1727096430.65882: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 24971 1727096430.67212: done with get_vars() 24971 1727096430.67224: variable 'ansible_search_path' from source: unknown 24971 1727096430.67225: variable 'ansible_search_path' from source: unknown 24971 1727096430.67248: we have included files to process 24971 1727096430.67249: generating all_blocks data 24971 1727096430.67250: done generating all_blocks data 24971 1727096430.67251: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 24971 1727096430.67252: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 24971 1727096430.67253: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 24971 1727096430.67898: done processing included file 24971 1727096430.67900: iterating over new_blocks loaded from include file 24971 1727096430.67901: in VariableManager get_vars() 24971 1727096430.67912: done with get_vars() 24971 1727096430.67913: filtering new block on tags 24971 1727096430.67927: done filtering new block on tags 24971 1727096430.67929: in VariableManager get_vars() 24971 1727096430.67938: done with get_vars() 24971 1727096430.67939: filtering new block on tags 24971 1727096430.67950: done filtering new block on tags 24971 1727096430.67952: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node3 24971 1727096430.67955: extending task lists for all hosts with included blocks 24971 1727096430.68049: done extending task lists 24971 1727096430.68050: done processing included files 24971 1727096430.68051: results queue empty 24971 1727096430.68051: checking for any_errors_fatal 24971 1727096430.68053: done checking for any_errors_fatal 24971 1727096430.68054: checking for max_fail_percentage 24971 1727096430.68054: done checking for max_fail_percentage 24971 1727096430.68055: checking to see if all hosts have failed and the running result is not ok 24971 1727096430.68055: done checking to see if all hosts have failed 24971 1727096430.68056: getting the remaining hosts for this loop 24971 1727096430.68056: done getting the remaining hosts for this loop 24971 1727096430.68058: getting the next task for host managed_node3 24971 1727096430.68060: done getting next task for host managed_node3 24971 1727096430.68062: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 24971 1727096430.68064: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24971 1727096430.68065: getting variables 24971 1727096430.68065: in VariableManager get_vars() 24971 1727096430.68108: Calling all_inventory to load vars for managed_node3 24971 1727096430.68110: Calling groups_inventory to load vars for managed_node3 24971 1727096430.68111: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096430.68115: Calling all_plugins_play to load vars for managed_node3 24971 1727096430.68116: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096430.68118: Calling groups_plugins_play to load vars for managed_node3 24971 1727096430.68750: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096430.70240: done with get_vars() 24971 1727096430.70262: done getting variables 24971 1727096430.70303: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Monday 23 September 2024 09:00:30 -0400 (0:00:00.066) 0:00:18.181 ****** 24971 1727096430.70332: entering _queue_task() for managed_node3/set_fact 24971 1727096430.70637: worker is 1 (out of 1 available) 24971 1727096430.70649: exiting _queue_task() for managed_node3/set_fact 24971 1727096430.70661: done queuing things up, now waiting for results queue to drain 24971 1727096430.70661: waiting for pending results... 
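The set_fact task queued here seeds the three profile flags that later tasks in get_profile_stat.yml presumably update; a sketch consistent with the ansible_facts echoed in the result below, where all three start out false:

    - name: Initialize NM profile exist and ansible_managed comment flag
      ansible.builtin.set_fact:
        lsr_net_profile_exists: false
        lsr_net_profile_ansible_managed: false
        lsr_net_profile_fingerprint: false
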
24971 1727096430.71105: running TaskExecutor() for managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag 24971 1727096430.71159: in run() - task 0afff68d-5257-3482-6844-0000000004b0 24971 1727096430.71183: variable 'ansible_search_path' from source: unknown 24971 1727096430.71191: variable 'ansible_search_path' from source: unknown 24971 1727096430.71235: calling self._execute() 24971 1727096430.71332: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096430.71342: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096430.71355: variable 'omit' from source: magic vars 24971 1727096430.71741: variable 'ansible_distribution_major_version' from source: facts 24971 1727096430.71746: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096430.71748: variable 'omit' from source: magic vars 24971 1727096430.71793: variable 'omit' from source: magic vars 24971 1727096430.71830: variable 'omit' from source: magic vars 24971 1727096430.71877: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24971 1727096430.71959: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24971 1727096430.71963: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24971 1727096430.71965: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096430.71980: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096430.72012: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24971 1727096430.72020: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096430.72027: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096430.72127: Set connection var ansible_shell_type to sh 24971 1727096430.72141: Set connection var ansible_shell_executable to /bin/sh 24971 1727096430.72174: Set connection var ansible_timeout to 10 24971 1727096430.72177: Set connection var ansible_connection to ssh 24971 1727096430.72179: Set connection var ansible_pipelining to False 24971 1727096430.72182: Set connection var ansible_module_compression to ZIP_DEFLATED 24971 1727096430.72273: variable 'ansible_shell_executable' from source: unknown 24971 1727096430.72278: variable 'ansible_connection' from source: unknown 24971 1727096430.72281: variable 'ansible_module_compression' from source: unknown 24971 1727096430.72283: variable 'ansible_shell_type' from source: unknown 24971 1727096430.72285: variable 'ansible_shell_executable' from source: unknown 24971 1727096430.72288: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096430.72290: variable 'ansible_pipelining' from source: unknown 24971 1727096430.72293: variable 'ansible_timeout' from source: unknown 24971 1727096430.72296: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096430.72410: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24971 1727096430.72414: variable 
'omit' from source: magic vars 24971 1727096430.72416: starting attempt loop 24971 1727096430.72418: running the handler 24971 1727096430.72420: handler run complete 24971 1727096430.72435: attempt loop complete, returning result 24971 1727096430.72441: _execute() done 24971 1727096430.72447: dumping result to json 24971 1727096430.72456: done dumping result, returning 24971 1727096430.72518: done running TaskExecutor() for managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag [0afff68d-5257-3482-6844-0000000004b0] 24971 1727096430.72521: sending task result for task 0afff68d-5257-3482-6844-0000000004b0 24971 1727096430.72584: done sending task result for task 0afff68d-5257-3482-6844-0000000004b0 24971 1727096430.72587: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 24971 1727096430.72678: no more pending results, returning what we have 24971 1727096430.72682: results queue empty 24971 1727096430.72683: checking for any_errors_fatal 24971 1727096430.72685: done checking for any_errors_fatal 24971 1727096430.72686: checking for max_fail_percentage 24971 1727096430.72687: done checking for max_fail_percentage 24971 1727096430.72688: checking to see if all hosts have failed and the running result is not ok 24971 1727096430.72689: done checking to see if all hosts have failed 24971 1727096430.72690: getting the remaining hosts for this loop 24971 1727096430.72691: done getting the remaining hosts for this loop 24971 1727096430.72695: getting the next task for host managed_node3 24971 1727096430.72702: done getting next task for host managed_node3 24971 1727096430.72705: ^ task is: TASK: Stat profile file 24971 1727096430.72709: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24971 1727096430.72713: getting variables 24971 1727096430.72715: in VariableManager get_vars() 24971 1727096430.72754: Calling all_inventory to load vars for managed_node3 24971 1727096430.72757: Calling groups_inventory to load vars for managed_node3 24971 1727096430.72760: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096430.72774: Calling all_plugins_play to load vars for managed_node3 24971 1727096430.72777: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096430.72781: Calling groups_plugins_play to load vars for managed_node3 24971 1727096430.74393: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096430.75921: done with get_vars() 24971 1727096430.75942: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Monday 23 September 2024 09:00:30 -0400 (0:00:00.056) 0:00:18.238 ****** 24971 1727096430.76029: entering _queue_task() for managed_node3/stat 24971 1727096430.76311: worker is 1 (out of 1 available) 24971 1727096430.76326: exiting _queue_task() for managed_node3/stat 24971 1727096430.76339: done queuing things up, now waiting for results queue to drain 24971 1727096430.76340: waiting for pending results... 24971 1727096430.76648: running TaskExecutor() for managed_node3/TASK: Stat profile file 24971 1727096430.76701: in run() - task 0afff68d-5257-3482-6844-0000000004b1 24971 1727096430.76724: variable 'ansible_search_path' from source: unknown 24971 1727096430.76731: variable 'ansible_search_path' from source: unknown 24971 1727096430.76779: calling self._execute() 24971 1727096430.77075: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096430.77079: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096430.77081: variable 'omit' from source: magic vars 24971 1727096430.77274: variable 'ansible_distribution_major_version' from source: facts 24971 1727096430.77292: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096430.77307: variable 'omit' from source: magic vars 24971 1727096430.77359: variable 'omit' from source: magic vars 24971 1727096430.77463: variable 'profile' from source: include params 24971 1727096430.77479: variable 'interface' from source: play vars 24971 1727096430.77553: variable 'interface' from source: play vars 24971 1727096430.77583: variable 'omit' from source: magic vars 24971 1727096430.77630: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24971 1727096430.77673: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24971 1727096430.77698: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24971 1727096430.77719: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096430.77738: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096430.77776: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24971 1727096430.77784: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096430.77791: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096430.77898: Set connection var ansible_shell_type to sh 24971 1727096430.77911: Set connection var ansible_shell_executable to /bin/sh 24971 1727096430.77928: Set connection var ansible_timeout to 10 24971 1727096430.77936: Set connection var ansible_connection to ssh 24971 1727096430.77945: Set connection var ansible_pipelining to False 24971 1727096430.77958: Set connection var ansible_module_compression to ZIP_DEFLATED 24971 1727096430.77987: variable 'ansible_shell_executable' from source: unknown 24971 1727096430.78073: variable 'ansible_connection' from source: unknown 24971 1727096430.78077: variable 'ansible_module_compression' from source: unknown 24971 1727096430.78079: variable 'ansible_shell_type' from source: unknown 24971 1727096430.78081: variable 'ansible_shell_executable' from source: unknown 24971 1727096430.78083: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096430.78085: variable 'ansible_pipelining' from source: unknown 24971 1727096430.78087: variable 'ansible_timeout' from source: unknown 24971 1727096430.78090: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096430.78232: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 24971 1727096430.78248: variable 'omit' from source: magic vars 24971 1727096430.78257: starting attempt loop 24971 1727096430.78263: running the handler 24971 1727096430.78290: _low_level_execute_command(): starting 24971 1727096430.78301: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24971 1727096430.79027: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24971 1727096430.79046: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096430.79062: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096430.79084: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24971 1727096430.79102: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 24971 1727096430.79113: stderr chunk (state=3): >>>debug2: match not found <<< 24971 1727096430.79125: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096430.79164: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096430.79234: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096430.79272: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096430.79284: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096430.79353: stderr chunk 
(state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096430.81255: stdout chunk (state=3): >>>/root <<< 24971 1727096430.81259: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096430.81261: stderr chunk (state=3): >>><<< 24971 1727096430.81279: stdout chunk (state=3): >>><<< 24971 1727096430.81311: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096430.81330: _low_level_execute_command(): starting 24971 1727096430.81341: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096430.8131864-25750-220411353091637 `" && echo ansible-tmp-1727096430.8131864-25750-220411353091637="` echo /root/.ansible/tmp/ansible-tmp-1727096430.8131864-25750-220411353091637 `" ) && sleep 0' 24971 1727096430.82059: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24971 1727096430.82080: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096430.82102: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096430.82120: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24971 1727096430.82135: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 24971 1727096430.82145: stderr chunk (state=3): >>>debug2: match not found <<< 24971 1727096430.82158: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096430.82188: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24971 1727096430.82201: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address <<< 24971 1727096430.82213: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24971 1727096430.82226: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096430.82322: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096430.82334: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096430.82350: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096430.82419: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096430.84355: stdout chunk (state=3): >>>ansible-tmp-1727096430.8131864-25750-220411353091637=/root/.ansible/tmp/ansible-tmp-1727096430.8131864-25750-220411353091637 <<< 24971 1727096430.84488: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096430.84511: stderr chunk (state=3): >>><<< 24971 1727096430.84525: stdout chunk (state=3): >>><<< 24971 1727096430.84588: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096430.8131864-25750-220411353091637=/root/.ansible/tmp/ansible-tmp-1727096430.8131864-25750-220411353091637 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096430.84642: variable 'ansible_module_compression' from source: unknown 24971 1727096430.84718: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24971ctsu8jan/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 24971 1727096430.84762: variable 'ansible_facts' from source: unknown 24971 1727096430.84875: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096430.8131864-25750-220411353091637/AnsiballZ_stat.py 24971 1727096430.85105: Sending initial data 24971 1727096430.85108: Sent initial data (153 bytes) 24971 1727096430.85692: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096430.85747: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096430.85763: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096430.85791: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096430.85859: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096430.87474: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24971 1727096430.87513: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 24971 1727096430.87554: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24971ctsu8jan/tmp87xehxdj /root/.ansible/tmp/ansible-tmp-1727096430.8131864-25750-220411353091637/AnsiballZ_stat.py <<< 24971 1727096430.87557: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096430.8131864-25750-220411353091637/AnsiballZ_stat.py" <<< 24971 1727096430.87599: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24971ctsu8jan/tmp87xehxdj" to remote "/root/.ansible/tmp/ansible-tmp-1727096430.8131864-25750-220411353091637/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096430.8131864-25750-220411353091637/AnsiballZ_stat.py" <<< 24971 1727096430.88388: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096430.88391: stdout chunk (state=3): >>><<< 24971 1727096430.88393: stderr chunk (state=3): >>><<< 24971 1727096430.88398: done transferring module to remote 24971 1727096430.88408: _low_level_execute_command(): starting 24971 1727096430.88412: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096430.8131864-25750-220411353091637/ /root/.ansible/tmp/ansible-tmp-1727096430.8131864-25750-220411353091637/AnsiballZ_stat.py && sleep 0' 24971 1727096430.88860: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096430.88864: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 24971 1727096430.88873: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096430.88876: stderr chunk (state=3): 
>>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096430.88878: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 24971 1727096430.88880: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096430.88923: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096430.88926: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096430.88962: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096430.90875: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096430.90910: stderr chunk (state=3): >>><<< 24971 1727096430.90913: stdout chunk (state=3): >>><<< 24971 1727096430.90928: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096430.90964: _low_level_execute_command(): starting 24971 1727096430.90971: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096430.8131864-25750-220411353091637/AnsiballZ_stat.py && sleep 0' 24971 1727096430.91420: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096430.91424: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096430.91426: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address <<< 24971 1727096430.91428: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096430.91430: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096430.91481: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096430.91484: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096430.91525: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096431.06706: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-veth0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 24971 1727096431.07994: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. <<< 24971 1727096431.08023: stderr chunk (state=3): >>><<< 24971 1727096431.08026: stdout chunk (state=3): >>><<< 24971 1727096431.08041: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-veth0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
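[editor's note] The JSON echoed above is the raw return of the stat module run for the "Stat profile file" task reported a few lines below; it checked /etc/sysconfig/network-scripts/ifcfg-veth0 and found nothing. A minimal sketch of a task that would produce this module invocation follows; the argument values mirror the module_args in the log, while the register name and the {{ profile }} templating are assumptions inferred from the surrounding log, not taken from the actual playbook:

# Hypothetical reconstruction of the "Stat profile file" task (not the real source).
- name: Stat profile file
  stat:
    path: "/etc/sysconfig/network-scripts/ifcfg-{{ profile }}"  # resolves to ifcfg-veth0 in this run (templating assumed)
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: profile_stat   # assumed register name, suggested by the later profile_stat.stat.exists check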
24971 1727096431.08064: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-veth0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096430.8131864-25750-220411353091637/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24971 1727096431.08075: _low_level_execute_command(): starting 24971 1727096431.08080: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096430.8131864-25750-220411353091637/ > /dev/null 2>&1 && sleep 0' 24971 1727096431.08537: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096431.08541: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 24971 1727096431.08548: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 24971 1727096431.08553: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096431.08555: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096431.08606: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096431.08613: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096431.08615: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096431.08647: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096431.10436: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096431.10461: stderr chunk (state=3): >>><<< 24971 1727096431.10464: stdout chunk (state=3): >>><<< 24971 1727096431.10481: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096431.10490: handler run complete 24971 1727096431.10507: attempt loop complete, returning result 24971 1727096431.10510: _execute() done 24971 1727096431.10513: dumping result to json 24971 1727096431.10515: done dumping result, returning 24971 1727096431.10522: done running TaskExecutor() for managed_node3/TASK: Stat profile file [0afff68d-5257-3482-6844-0000000004b1] 24971 1727096431.10526: sending task result for task 0afff68d-5257-3482-6844-0000000004b1 24971 1727096431.10616: done sending task result for task 0afff68d-5257-3482-6844-0000000004b1 24971 1727096431.10619: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "exists": false } } 24971 1727096431.10676: no more pending results, returning what we have 24971 1727096431.10679: results queue empty 24971 1727096431.10680: checking for any_errors_fatal 24971 1727096431.10686: done checking for any_errors_fatal 24971 1727096431.10686: checking for max_fail_percentage 24971 1727096431.10688: done checking for max_fail_percentage 24971 1727096431.10689: checking to see if all hosts have failed and the running result is not ok 24971 1727096431.10690: done checking to see if all hosts have failed 24971 1727096431.10690: getting the remaining hosts for this loop 24971 1727096431.10692: done getting the remaining hosts for this loop 24971 1727096431.10695: getting the next task for host managed_node3 24971 1727096431.10701: done getting next task for host managed_node3 24971 1727096431.10704: ^ task is: TASK: Set NM profile exist flag based on the profile files 24971 1727096431.10708: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24971 1727096431.10712: getting variables 24971 1727096431.10713: in VariableManager get_vars() 24971 1727096431.10753: Calling all_inventory to load vars for managed_node3 24971 1727096431.10756: Calling groups_inventory to load vars for managed_node3 24971 1727096431.10758: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096431.10772: Calling all_plugins_play to load vars for managed_node3 24971 1727096431.10775: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096431.10777: Calling groups_plugins_play to load vars for managed_node3 24971 1727096431.11582: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096431.12433: done with get_vars() 24971 1727096431.12449: done getting variables 24971 1727096431.12495: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Monday 23 September 2024 09:00:31 -0400 (0:00:00.364) 0:00:18.602 ****** 24971 1727096431.12519: entering _queue_task() for managed_node3/set_fact 24971 1727096431.12743: worker is 1 (out of 1 available) 24971 1727096431.12756: exiting _queue_task() for managed_node3/set_fact 24971 1727096431.12771: done queuing things up, now waiting for results queue to drain 24971 1727096431.12772: waiting for pending results... 
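[editor's note] The log now queues the set_fact task defined at get_profile_stat.yml:17. A minimal sketch of what such a task might look like, assuming the fact name (the condition matches the conditional evaluated just below, which is False in this run, so the task is skipped):

# Hypothetical sketch of the task at get_profile_stat.yml:17; the fact name is an assumption.
- name: Set NM profile exist flag based on the profile files
  set_fact:
    profile_exists: true   # assumed fact name
  when: profile_stat.stat.exists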
24971 1727096431.12941: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files 24971 1727096431.13012: in run() - task 0afff68d-5257-3482-6844-0000000004b2 24971 1727096431.13022: variable 'ansible_search_path' from source: unknown 24971 1727096431.13025: variable 'ansible_search_path' from source: unknown 24971 1727096431.13054: calling self._execute() 24971 1727096431.13127: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096431.13130: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096431.13140: variable 'omit' from source: magic vars 24971 1727096431.13405: variable 'ansible_distribution_major_version' from source: facts 24971 1727096431.13415: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096431.13503: variable 'profile_stat' from source: set_fact 24971 1727096431.13513: Evaluated conditional (profile_stat.stat.exists): False 24971 1727096431.13516: when evaluation is False, skipping this task 24971 1727096431.13518: _execute() done 24971 1727096431.13521: dumping result to json 24971 1727096431.13524: done dumping result, returning 24971 1727096431.13529: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files [0afff68d-5257-3482-6844-0000000004b2] 24971 1727096431.13533: sending task result for task 0afff68d-5257-3482-6844-0000000004b2 24971 1727096431.13613: done sending task result for task 0afff68d-5257-3482-6844-0000000004b2 24971 1727096431.13616: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 24971 1727096431.13691: no more pending results, returning what we have 24971 1727096431.13694: results queue empty 24971 1727096431.13695: checking for any_errors_fatal 24971 1727096431.13703: done checking for any_errors_fatal 24971 1727096431.13703: checking for max_fail_percentage 24971 1727096431.13705: done checking for max_fail_percentage 24971 1727096431.13706: checking to see if all hosts have failed and the running result is not ok 24971 1727096431.13707: done checking to see if all hosts have failed 24971 1727096431.13708: getting the remaining hosts for this loop 24971 1727096431.13709: done getting the remaining hosts for this loop 24971 1727096431.13712: getting the next task for host managed_node3 24971 1727096431.13717: done getting next task for host managed_node3 24971 1727096431.13719: ^ task is: TASK: Get NM profile info 24971 1727096431.13723: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24971 1727096431.13726: getting variables 24971 1727096431.13727: in VariableManager get_vars() 24971 1727096431.13759: Calling all_inventory to load vars for managed_node3 24971 1727096431.13761: Calling groups_inventory to load vars for managed_node3 24971 1727096431.13763: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096431.13775: Calling all_plugins_play to load vars for managed_node3 24971 1727096431.13777: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096431.13780: Calling groups_plugins_play to load vars for managed_node3 24971 1727096431.14621: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096431.15459: done with get_vars() 24971 1727096431.15477: done getting variables 24971 1727096431.15517: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Monday 23 September 2024 09:00:31 -0400 (0:00:00.030) 0:00:18.633 ****** 24971 1727096431.15537: entering _queue_task() for managed_node3/shell 24971 1727096431.15757: worker is 1 (out of 1 available) 24971 1727096431.15771: exiting _queue_task() for managed_node3/shell 24971 1727096431.15784: done queuing things up, now waiting for results queue to drain 24971 1727096431.15785: waiting for pending results... 24971 1727096431.15951: running TaskExecutor() for managed_node3/TASK: Get NM profile info 24971 1727096431.16019: in run() - task 0afff68d-5257-3482-6844-0000000004b3 24971 1727096431.16031: variable 'ansible_search_path' from source: unknown 24971 1727096431.16035: variable 'ansible_search_path' from source: unknown 24971 1727096431.16061: calling self._execute() 24971 1727096431.16133: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096431.16137: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096431.16147: variable 'omit' from source: magic vars 24971 1727096431.16407: variable 'ansible_distribution_major_version' from source: facts 24971 1727096431.16417: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096431.16424: variable 'omit' from source: magic vars 24971 1727096431.16459: variable 'omit' from source: magic vars 24971 1727096431.16529: variable 'profile' from source: include params 24971 1727096431.16533: variable 'interface' from source: play vars 24971 1727096431.16586: variable 'interface' from source: play vars 24971 1727096431.16601: variable 'omit' from source: magic vars 24971 1727096431.16633: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24971 1727096431.16661: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24971 1727096431.16681: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24971 1727096431.16694: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096431.16703: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096431.16727: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24971 1727096431.16730: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096431.16732: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096431.16800: Set connection var ansible_shell_type to sh 24971 1727096431.16808: Set connection var ansible_shell_executable to /bin/sh 24971 1727096431.16817: Set connection var ansible_timeout to 10 24971 1727096431.16821: Set connection var ansible_connection to ssh 24971 1727096431.16826: Set connection var ansible_pipelining to False 24971 1727096431.16831: Set connection var ansible_module_compression to ZIP_DEFLATED 24971 1727096431.16847: variable 'ansible_shell_executable' from source: unknown 24971 1727096431.16850: variable 'ansible_connection' from source: unknown 24971 1727096431.16852: variable 'ansible_module_compression' from source: unknown 24971 1727096431.16855: variable 'ansible_shell_type' from source: unknown 24971 1727096431.16857: variable 'ansible_shell_executable' from source: unknown 24971 1727096431.16859: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096431.16863: variable 'ansible_pipelining' from source: unknown 24971 1727096431.16866: variable 'ansible_timeout' from source: unknown 24971 1727096431.16873: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096431.16995: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24971 1727096431.17003: variable 'omit' from source: magic vars 24971 1727096431.17006: starting attempt loop 24971 1727096431.17009: running the handler 24971 1727096431.17018: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24971 1727096431.17034: _low_level_execute_command(): starting 24971 1727096431.17041: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24971 1727096431.17530: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096431.17572: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24971 1727096431.17575: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096431.17579: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096431.17581: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096431.17626: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096431.17629: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096431.17631: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096431.17677: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096431.19322: stdout chunk (state=3): >>>/root <<< 24971 1727096431.19488: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096431.19494: stdout chunk (state=3): >>><<< 24971 1727096431.19497: stderr chunk (state=3): >>><<< 24971 1727096431.19524: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096431.19544: _low_level_execute_command(): starting 24971 1727096431.19627: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096431.1953127-25767-123424584307353 `" && echo ansible-tmp-1727096431.1953127-25767-123424584307353="` echo /root/.ansible/tmp/ansible-tmp-1727096431.1953127-25767-123424584307353 `" ) && sleep 0' 24971 1727096431.20273: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096431.20277: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 24971 1727096431.20279: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 24971 1727096431.20282: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24971 
1727096431.20284: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096431.20337: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096431.20345: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096431.20392: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096431.22297: stdout chunk (state=3): >>>ansible-tmp-1727096431.1953127-25767-123424584307353=/root/.ansible/tmp/ansible-tmp-1727096431.1953127-25767-123424584307353 <<< 24971 1727096431.22517: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096431.22520: stdout chunk (state=3): >>><<< 24971 1727096431.22522: stderr chunk (state=3): >>><<< 24971 1727096431.22539: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096431.1953127-25767-123424584307353=/root/.ansible/tmp/ansible-tmp-1727096431.1953127-25767-123424584307353 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096431.22777: variable 'ansible_module_compression' from source: unknown 24971 1727096431.22780: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24971ctsu8jan/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 24971 1727096431.22783: variable 'ansible_facts' from source: unknown 24971 1727096431.22863: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096431.1953127-25767-123424584307353/AnsiballZ_command.py 24971 1727096431.23008: Sending initial data 24971 1727096431.23115: Sent initial data (156 bytes) 24971 1727096431.23639: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24971 1727096431.23689: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096431.23779: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096431.23802: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096431.23820: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096431.23880: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096431.25458: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 24971 1727096431.25490: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24971 1727096431.25553: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 24971 1727096431.25600: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24971ctsu8jan/tmpkhx6nar2 /root/.ansible/tmp/ansible-tmp-1727096431.1953127-25767-123424584307353/AnsiballZ_command.py <<< 24971 1727096431.25604: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096431.1953127-25767-123424584307353/AnsiballZ_command.py" <<< 24971 1727096431.25675: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24971ctsu8jan/tmpkhx6nar2" to remote "/root/.ansible/tmp/ansible-tmp-1727096431.1953127-25767-123424584307353/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096431.1953127-25767-123424584307353/AnsiballZ_command.py" <<< 24971 1727096431.26596: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096431.26673: stderr chunk (state=3): >>><<< 24971 1727096431.26676: stdout chunk (state=3): >>><<< 24971 1727096431.26686: done transferring module to remote 24971 1727096431.26700: _low_level_execute_command(): starting 24971 1727096431.26709: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096431.1953127-25767-123424584307353/ /root/.ansible/tmp/ansible-tmp-1727096431.1953127-25767-123424584307353/AnsiballZ_command.py && sleep 0' 24971 1727096431.27349: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24971 1727096431.27452: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096431.27484: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096431.27501: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096431.27519: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096431.27582: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096431.29417: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096431.29427: stdout chunk (state=3): >>><<< 24971 1727096431.29442: stderr chunk (state=3): >>><<< 24971 1727096431.29463: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096431.29475: _low_level_execute_command(): starting 24971 1727096431.29484: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096431.1953127-25767-123424584307353/AnsiballZ_command.py && sleep 0' 24971 1727096431.30102: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24971 1727096431.30120: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096431.30181: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096431.30243: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096431.30262: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096431.30284: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096431.30362: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096431.47497: stdout chunk (state=3): >>> {"changed": true, "stdout": "veth0 /etc/NetworkManager/system-connections/veth0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc", "start": "2024-09-23 09:00:31.454361", "end": "2024-09-23 09:00:31.471760", "delta": "0:00:00.017399", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 24971 1727096431.49077: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
<<< 24971 1727096431.49081: stdout chunk (state=3): >>><<< 24971 1727096431.49087: stderr chunk (state=3): >>><<< 24971 1727096431.49104: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "veth0 /etc/NetworkManager/system-connections/veth0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc", "start": "2024-09-23 09:00:31.454361", "end": "2024-09-23 09:00:31.471760", "delta": "0:00:00.017399", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
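[editor's note] The command module result above shows the "Get NM profile info" task (get_profile_stat.yml:25) running an nmcli pipeline and finding the veth0 profile under /etc/NetworkManager/system-connections. A minimal sketch of the task, assuming the grep pattern is templated from a {{ profile }} variable; the command shape and the register name are taken from the cmd string and the later nm_profile_exists.rc check in the log:

# Hypothetical sketch of the task at get_profile_stat.yml:25 (templating assumed).
- name: Get NM profile info
  shell: "nmcli -f NAME,FILENAME connection show | grep {{ profile }} | grep /etc"
  register: nm_profile_exists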
24971 1727096431.49132: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096431.1953127-25767-123424584307353/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24971 1727096431.49139: _low_level_execute_command(): starting 24971 1727096431.49144: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096431.1953127-25767-123424584307353/ > /dev/null 2>&1 && sleep 0' 24971 1727096431.49559: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096431.49563: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24971 1727096431.49595: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 24971 1727096431.49603: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address <<< 24971 1727096431.49606: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096431.49608: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096431.49660: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096431.49664: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096431.49677: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096431.49700: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096431.51531: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096431.51534: stderr chunk (state=3): >>><<< 24971 1727096431.51537: stdout chunk (state=3): >>><<< 24971 1727096431.51554: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096431.51569: handler run complete 24971 1727096431.51587: Evaluated conditional (False): False 24971 1727096431.51595: attempt loop complete, returning result 24971 1727096431.51598: _execute() done 24971 1727096431.51600: dumping result to json 24971 1727096431.51605: done dumping result, returning 24971 1727096431.51614: done running TaskExecutor() for managed_node3/TASK: Get NM profile info [0afff68d-5257-3482-6844-0000000004b3] 24971 1727096431.51618: sending task result for task 0afff68d-5257-3482-6844-0000000004b3 24971 1727096431.51711: done sending task result for task 0afff68d-5257-3482-6844-0000000004b3 24971 1727096431.51714: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc", "delta": "0:00:00.017399", "end": "2024-09-23 09:00:31.471760", "rc": 0, "start": "2024-09-23 09:00:31.454361" } STDOUT: veth0 /etc/NetworkManager/system-connections/veth0.nmconnection 24971 1727096431.51783: no more pending results, returning what we have 24971 1727096431.51787: results queue empty 24971 1727096431.51788: checking for any_errors_fatal 24971 1727096431.51794: done checking for any_errors_fatal 24971 1727096431.51795: checking for max_fail_percentage 24971 1727096431.51797: done checking for max_fail_percentage 24971 1727096431.51797: checking to see if all hosts have failed and the running result is not ok 24971 1727096431.51798: done checking to see if all hosts have failed 24971 1727096431.51799: getting the remaining hosts for this loop 24971 1727096431.51800: done getting the remaining hosts for this loop 24971 1727096431.51804: getting the next task for host managed_node3 24971 1727096431.51810: done getting next task for host managed_node3 24971 1727096431.51812: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 24971 1727096431.51817: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24971 1727096431.51821: getting variables 24971 1727096431.51822: in VariableManager get_vars() 24971 1727096431.51861: Calling all_inventory to load vars for managed_node3 24971 1727096431.51864: Calling groups_inventory to load vars for managed_node3 24971 1727096431.51866: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096431.51879: Calling all_plugins_play to load vars for managed_node3 24971 1727096431.51882: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096431.51885: Calling groups_plugins_play to load vars for managed_node3 24971 1727096431.52730: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096431.55458: done with get_vars() 24971 1727096431.55545: done getting variables 24971 1727096431.55639: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Monday 23 September 2024 09:00:31 -0400 (0:00:00.401) 0:00:19.034 ****** 24971 1727096431.55679: entering _queue_task() for managed_node3/set_fact 24971 1727096431.56190: worker is 1 (out of 1 available) 24971 1727096431.56199: exiting _queue_task() for managed_node3/set_fact 24971 1727096431.56209: done queuing things up, now waiting for results queue to drain 24971 1727096431.56210: waiting for pending results... 
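The "Get NM profile info" result above comes from a shell-style command task in get_profile_stat.yml. The following is a minimal sketch of what that task plausibly looks like, reconstructed from the logged module arguments (_raw_params with _uses_shell set) and from the nm_profile_exists.rc == 0 conditional evaluated in the next task; the task name is taken from the log, but the use of the templated {{ profile }} variable, the module spelling, and the error handling are assumptions, not copied from the collection.

# Sketch only: reconstructed from the logged command and the later
# nm_profile_exists.rc == 0 check; details may differ from get_profile_stat.yml.
- name: Get NM profile info
  ansible.builtin.shell: "nmcli -f NAME,FILENAME connection show | grep {{ profile }} | grep /etc"
  register: nm_profile_exists
  ignore_errors: true   # assumption: a non-matching grep (rc != 0) must not fail the play

Because grep exits non-zero when no matching profile is found, the registered rc is what the following set_fact task keys on.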
24971 1727096431.56346: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 24971 1727096431.56439: in run() - task 0afff68d-5257-3482-6844-0000000004b4 24971 1727096431.56458: variable 'ansible_search_path' from source: unknown 24971 1727096431.56523: variable 'ansible_search_path' from source: unknown 24971 1727096431.56528: calling self._execute() 24971 1727096431.56619: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096431.56639: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096431.56669: variable 'omit' from source: magic vars 24971 1727096431.57076: variable 'ansible_distribution_major_version' from source: facts 24971 1727096431.57229: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096431.57270: variable 'nm_profile_exists' from source: set_fact 24971 1727096431.57293: Evaluated conditional (nm_profile_exists.rc == 0): True 24971 1727096431.57310: variable 'omit' from source: magic vars 24971 1727096431.57360: variable 'omit' from source: magic vars 24971 1727096431.57401: variable 'omit' from source: magic vars 24971 1727096431.57488: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24971 1727096431.57561: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24971 1727096431.57629: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24971 1727096431.57822: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096431.57826: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096431.57829: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24971 1727096431.57832: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096431.57854: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096431.57978: Set connection var ansible_shell_type to sh 24971 1727096431.57994: Set connection var ansible_shell_executable to /bin/sh 24971 1727096431.58012: Set connection var ansible_timeout to 10 24971 1727096431.58024: Set connection var ansible_connection to ssh 24971 1727096431.58035: Set connection var ansible_pipelining to False 24971 1727096431.58046: Set connection var ansible_module_compression to ZIP_DEFLATED 24971 1727096431.58079: variable 'ansible_shell_executable' from source: unknown 24971 1727096431.58172: variable 'ansible_connection' from source: unknown 24971 1727096431.58178: variable 'ansible_module_compression' from source: unknown 24971 1727096431.58180: variable 'ansible_shell_type' from source: unknown 24971 1727096431.58183: variable 'ansible_shell_executable' from source: unknown 24971 1727096431.58185: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096431.58187: variable 'ansible_pipelining' from source: unknown 24971 1727096431.58189: variable 'ansible_timeout' from source: unknown 24971 1727096431.58191: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096431.58308: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24971 1727096431.58312: variable 'omit' from source: magic vars 24971 1727096431.58314: starting attempt loop 24971 1727096431.58317: running the handler 24971 1727096431.58319: handler run complete 24971 1727096431.58331: attempt loop complete, returning result 24971 1727096431.58337: _execute() done 24971 1727096431.58343: dumping result to json 24971 1727096431.58349: done dumping result, returning 24971 1727096431.58378: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0afff68d-5257-3482-6844-0000000004b4] 24971 1727096431.58382: sending task result for task 0afff68d-5257-3482-6844-0000000004b4 ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 24971 1727096431.58636: no more pending results, returning what we have 24971 1727096431.58639: results queue empty 24971 1727096431.58640: checking for any_errors_fatal 24971 1727096431.58649: done checking for any_errors_fatal 24971 1727096431.58650: checking for max_fail_percentage 24971 1727096431.58652: done checking for max_fail_percentage 24971 1727096431.58653: checking to see if all hosts have failed and the running result is not ok 24971 1727096431.58654: done checking to see if all hosts have failed 24971 1727096431.58655: getting the remaining hosts for this loop 24971 1727096431.58656: done getting the remaining hosts for this loop 24971 1727096431.58660: getting the next task for host managed_node3 24971 1727096431.58671: done getting next task for host managed_node3 24971 1727096431.58674: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 24971 1727096431.58679: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24971 1727096431.58683: getting variables 24971 1727096431.58684: in VariableManager get_vars() 24971 1727096431.58725: Calling all_inventory to load vars for managed_node3 24971 1727096431.58729: Calling groups_inventory to load vars for managed_node3 24971 1727096431.58731: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096431.58743: Calling all_plugins_play to load vars for managed_node3 24971 1727096431.58746: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096431.58748: Calling groups_plugins_play to load vars for managed_node3 24971 1727096431.59337: done sending task result for task 0afff68d-5257-3482-6844-0000000004b4 24971 1727096431.59341: WORKER PROCESS EXITING 24971 1727096431.61206: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096431.63162: done with get_vars() 24971 1727096431.63184: done getting variables 24971 1727096431.63237: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 24971 1727096431.63351: variable 'profile' from source: include params 24971 1727096431.63355: variable 'interface' from source: play vars 24971 1727096431.63420: variable 'interface' from source: play vars TASK [Get the ansible_managed comment in ifcfg-veth0] ************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Monday 23 September 2024 09:00:31 -0400 (0:00:00.077) 0:00:19.112 ****** 24971 1727096431.63456: entering _queue_task() for managed_node3/command 24971 1727096431.63898: worker is 1 (out of 1 available) 24971 1727096431.63908: exiting _queue_task() for managed_node3/command 24971 1727096431.63918: done queuing things up, now waiting for results queue to drain 24971 1727096431.63919: waiting for pending results... 
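The set_fact task whose result is shown above maps the nmcli hit onto three flags that the later assert tasks consume. A minimal sketch, reconstructed directly from the logged task name, the nm_profile_exists.rc == 0 conditional, and the returned ansible_facts; the task at get_profile_stat.yml:35 may set these values via expressions rather than literals.

# Sketch only: fact names and the when: guard are taken from the log above.
- name: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
  ansible.builtin.set_fact:
    lsr_net_profile_exists: true
    lsr_net_profile_ansible_managed: true
    lsr_net_profile_fingerprint: true
  when: nm_profile_exists.rc == 0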
24971 1727096431.63993: running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-veth0 24971 1727096431.64108: in run() - task 0afff68d-5257-3482-6844-0000000004b6 24971 1727096431.64127: variable 'ansible_search_path' from source: unknown 24971 1727096431.64134: variable 'ansible_search_path' from source: unknown 24971 1727096431.64374: calling self._execute() 24971 1727096431.64377: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096431.64380: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096431.64382: variable 'omit' from source: magic vars 24971 1727096431.64611: variable 'ansible_distribution_major_version' from source: facts 24971 1727096431.64627: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096431.64747: variable 'profile_stat' from source: set_fact 24971 1727096431.64765: Evaluated conditional (profile_stat.stat.exists): False 24971 1727096431.64776: when evaluation is False, skipping this task 24971 1727096431.64785: _execute() done 24971 1727096431.64792: dumping result to json 24971 1727096431.64799: done dumping result, returning 24971 1727096431.64808: done running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-veth0 [0afff68d-5257-3482-6844-0000000004b6] 24971 1727096431.64818: sending task result for task 0afff68d-5257-3482-6844-0000000004b6 24971 1727096431.64912: done sending task result for task 0afff68d-5257-3482-6844-0000000004b6 24971 1727096431.64921: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 24971 1727096431.64979: no more pending results, returning what we have 24971 1727096431.64982: results queue empty 24971 1727096431.64983: checking for any_errors_fatal 24971 1727096431.64990: done checking for any_errors_fatal 24971 1727096431.64991: checking for max_fail_percentage 24971 1727096431.64993: done checking for max_fail_percentage 24971 1727096431.64994: checking to see if all hosts have failed and the running result is not ok 24971 1727096431.64995: done checking to see if all hosts have failed 24971 1727096431.64995: getting the remaining hosts for this loop 24971 1727096431.64997: done getting the remaining hosts for this loop 24971 1727096431.65000: getting the next task for host managed_node3 24971 1727096431.65007: done getting next task for host managed_node3 24971 1727096431.65009: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 24971 1727096431.65014: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24971 1727096431.65018: getting variables 24971 1727096431.65020: in VariableManager get_vars() 24971 1727096431.65061: Calling all_inventory to load vars for managed_node3 24971 1727096431.65064: Calling groups_inventory to load vars for managed_node3 24971 1727096431.65069: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096431.65081: Calling all_plugins_play to load vars for managed_node3 24971 1727096431.65084: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096431.65087: Calling groups_plugins_play to load vars for managed_node3 24971 1727096431.66686: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096431.68160: done with get_vars() 24971 1727096431.68183: done getting variables 24971 1727096431.68234: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 24971 1727096431.68329: variable 'profile' from source: include params 24971 1727096431.68333: variable 'interface' from source: play vars 24971 1727096431.68387: variable 'interface' from source: play vars TASK [Verify the ansible_managed comment in ifcfg-veth0] *********************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Monday 23 September 2024 09:00:31 -0400 (0:00:00.049) 0:00:19.161 ****** 24971 1727096431.68415: entering _queue_task() for managed_node3/set_fact 24971 1727096431.68665: worker is 1 (out of 1 available) 24971 1727096431.68876: exiting _queue_task() for managed_node3/set_fact 24971 1727096431.68885: done queuing things up, now waiting for results queue to drain 24971 1727096431.68886: waiting for pending results... 
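This task, and the three that follow it (verify the ansible_managed comment, then get and verify the fingerprint comment), are skipped on this host: the veth0 connection lives in a NetworkManager keyfile under /etc/NetworkManager/system-connections, so the ifcfg-oriented profile_stat.stat.exists guard presumably evaluates to false. A hedged sketch of the shared pattern follows; the when: guard is taken from the logged false_condition, while the grep pattern and ifcfg path are assumptions, since the command body is never executed on this run.

# Sketch only: the command body is assumed; only the when: guard is grounded in the log.
- name: Get the ansible_managed comment in ifcfg-{{ profile }}
  ansible.builtin.command: grep 'ansible managed' /etc/sysconfig/network-scripts/ifcfg-{{ profile }}
  when: profile_stat.stat.exists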
24971 1727096431.68940: running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-veth0 24971 1727096431.69063: in run() - task 0afff68d-5257-3482-6844-0000000004b7 24971 1727096431.69086: variable 'ansible_search_path' from source: unknown 24971 1727096431.69110: variable 'ansible_search_path' from source: unknown 24971 1727096431.69137: calling self._execute() 24971 1727096431.69273: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096431.69276: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096431.69279: variable 'omit' from source: magic vars 24971 1727096431.69598: variable 'ansible_distribution_major_version' from source: facts 24971 1727096431.69616: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096431.69738: variable 'profile_stat' from source: set_fact 24971 1727096431.69758: Evaluated conditional (profile_stat.stat.exists): False 24971 1727096431.69777: when evaluation is False, skipping this task 24971 1727096431.69874: _execute() done 24971 1727096431.69877: dumping result to json 24971 1727096431.69879: done dumping result, returning 24971 1727096431.69882: done running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-veth0 [0afff68d-5257-3482-6844-0000000004b7] 24971 1727096431.69884: sending task result for task 0afff68d-5257-3482-6844-0000000004b7 24971 1727096431.69944: done sending task result for task 0afff68d-5257-3482-6844-0000000004b7 24971 1727096431.69947: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 24971 1727096431.70023: no more pending results, returning what we have 24971 1727096431.70028: results queue empty 24971 1727096431.70029: checking for any_errors_fatal 24971 1727096431.70035: done checking for any_errors_fatal 24971 1727096431.70036: checking for max_fail_percentage 24971 1727096431.70038: done checking for max_fail_percentage 24971 1727096431.70039: checking to see if all hosts have failed and the running result is not ok 24971 1727096431.70040: done checking to see if all hosts have failed 24971 1727096431.70040: getting the remaining hosts for this loop 24971 1727096431.70042: done getting the remaining hosts for this loop 24971 1727096431.70045: getting the next task for host managed_node3 24971 1727096431.70052: done getting next task for host managed_node3 24971 1727096431.70054: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 24971 1727096431.70059: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24971 1727096431.70064: getting variables 24971 1727096431.70065: in VariableManager get_vars() 24971 1727096431.70109: Calling all_inventory to load vars for managed_node3 24971 1727096431.70112: Calling groups_inventory to load vars for managed_node3 24971 1727096431.70115: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096431.70128: Calling all_plugins_play to load vars for managed_node3 24971 1727096431.70132: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096431.70135: Calling groups_plugins_play to load vars for managed_node3 24971 1727096431.71485: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096431.72352: done with get_vars() 24971 1727096431.72370: done getting variables 24971 1727096431.72408: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 24971 1727096431.72479: variable 'profile' from source: include params 24971 1727096431.72482: variable 'interface' from source: play vars 24971 1727096431.72518: variable 'interface' from source: play vars TASK [Get the fingerprint comment in ifcfg-veth0] ****************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Monday 23 September 2024 09:00:31 -0400 (0:00:00.041) 0:00:19.203 ****** 24971 1727096431.72539: entering _queue_task() for managed_node3/command 24971 1727096431.72716: worker is 1 (out of 1 available) 24971 1727096431.72728: exiting _queue_task() for managed_node3/command 24971 1727096431.72741: done queuing things up, now waiting for results queue to drain 24971 1727096431.72742: waiting for pending results... 
24971 1727096431.72907: running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-veth0 24971 1727096431.72982: in run() - task 0afff68d-5257-3482-6844-0000000004b8 24971 1727096431.72993: variable 'ansible_search_path' from source: unknown 24971 1727096431.72996: variable 'ansible_search_path' from source: unknown 24971 1727096431.73023: calling self._execute() 24971 1727096431.73096: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096431.73101: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096431.73109: variable 'omit' from source: magic vars 24971 1727096431.73410: variable 'ansible_distribution_major_version' from source: facts 24971 1727096431.73573: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096431.73578: variable 'profile_stat' from source: set_fact 24971 1727096431.73580: Evaluated conditional (profile_stat.stat.exists): False 24971 1727096431.73582: when evaluation is False, skipping this task 24971 1727096431.73585: _execute() done 24971 1727096431.73587: dumping result to json 24971 1727096431.73589: done dumping result, returning 24971 1727096431.73591: done running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-veth0 [0afff68d-5257-3482-6844-0000000004b8] 24971 1727096431.73615: sending task result for task 0afff68d-5257-3482-6844-0000000004b8 skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 24971 1727096431.73755: no more pending results, returning what we have 24971 1727096431.73759: results queue empty 24971 1727096431.73760: checking for any_errors_fatal 24971 1727096431.73769: done checking for any_errors_fatal 24971 1727096431.73770: checking for max_fail_percentage 24971 1727096431.73772: done checking for max_fail_percentage 24971 1727096431.73773: checking to see if all hosts have failed and the running result is not ok 24971 1727096431.73774: done checking to see if all hosts have failed 24971 1727096431.73774: getting the remaining hosts for this loop 24971 1727096431.73776: done getting the remaining hosts for this loop 24971 1727096431.73779: getting the next task for host managed_node3 24971 1727096431.73785: done getting next task for host managed_node3 24971 1727096431.73788: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 24971 1727096431.73793: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24971 1727096431.73797: getting variables 24971 1727096431.73798: in VariableManager get_vars() 24971 1727096431.73839: Calling all_inventory to load vars for managed_node3 24971 1727096431.73842: Calling groups_inventory to load vars for managed_node3 24971 1727096431.73845: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096431.73858: Calling all_plugins_play to load vars for managed_node3 24971 1727096431.73861: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096431.73864: Calling groups_plugins_play to load vars for managed_node3 24971 1727096431.74080: done sending task result for task 0afff68d-5257-3482-6844-0000000004b8 24971 1727096431.74084: WORKER PROCESS EXITING 24971 1727096431.75028: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096431.75891: done with get_vars() 24971 1727096431.75905: done getting variables 24971 1727096431.75947: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 24971 1727096431.76021: variable 'profile' from source: include params 24971 1727096431.76024: variable 'interface' from source: play vars 24971 1727096431.76063: variable 'interface' from source: play vars TASK [Verify the fingerprint comment in ifcfg-veth0] *************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Monday 23 September 2024 09:00:31 -0400 (0:00:00.035) 0:00:19.238 ****** 24971 1727096431.76089: entering _queue_task() for managed_node3/set_fact 24971 1727096431.76313: worker is 1 (out of 1 available) 24971 1727096431.76323: exiting _queue_task() for managed_node3/set_fact 24971 1727096431.76336: done queuing things up, now waiting for results queue to drain 24971 1727096431.76337: waiting for pending results... 
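The profile_stat variable consulted by these guards has the shape of a registered ansible.builtin.stat result (profile_stat.stat.exists), so it is presumably produced by an earlier stat task in get_profile_stat.yml, outside the portion of the log shown here. A plausible sketch under that assumption; the exact path and task wording are guesses.

# Sketch only: inferred from the profile_stat.stat.exists accesses in this log.
- name: Get stat of the ifcfg file for {{ profile }}
  ansible.builtin.stat:
    path: /etc/sysconfig/network-scripts/ifcfg-{{ profile }}
  register: profile_stat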
24971 1727096431.76621: running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-veth0 24971 1727096431.76752: in run() - task 0afff68d-5257-3482-6844-0000000004b9 24971 1727096431.76793: variable 'ansible_search_path' from source: unknown 24971 1727096431.76799: variable 'ansible_search_path' from source: unknown 24971 1727096431.76974: calling self._execute() 24971 1727096431.76978: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096431.76981: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096431.76984: variable 'omit' from source: magic vars 24971 1727096431.77608: variable 'ansible_distribution_major_version' from source: facts 24971 1727096431.77626: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096431.77762: variable 'profile_stat' from source: set_fact 24971 1727096431.77785: Evaluated conditional (profile_stat.stat.exists): False 24971 1727096431.77794: when evaluation is False, skipping this task 24971 1727096431.77801: _execute() done 24971 1727096431.77806: dumping result to json 24971 1727096431.77813: done dumping result, returning 24971 1727096431.77823: done running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-veth0 [0afff68d-5257-3482-6844-0000000004b9] 24971 1727096431.77831: sending task result for task 0afff68d-5257-3482-6844-0000000004b9 skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 24971 1727096431.78016: no more pending results, returning what we have 24971 1727096431.78022: results queue empty 24971 1727096431.78023: checking for any_errors_fatal 24971 1727096431.78036: done checking for any_errors_fatal 24971 1727096431.78037: checking for max_fail_percentage 24971 1727096431.78039: done checking for max_fail_percentage 24971 1727096431.78040: checking to see if all hosts have failed and the running result is not ok 24971 1727096431.78041: done checking to see if all hosts have failed 24971 1727096431.78042: getting the remaining hosts for this loop 24971 1727096431.78043: done getting the remaining hosts for this loop 24971 1727096431.78046: getting the next task for host managed_node3 24971 1727096431.78054: done getting next task for host managed_node3 24971 1727096431.78056: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 24971 1727096431.78060: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24971 1727096431.78065: getting variables 24971 1727096431.78071: in VariableManager get_vars() 24971 1727096431.78111: Calling all_inventory to load vars for managed_node3 24971 1727096431.78113: Calling groups_inventory to load vars for managed_node3 24971 1727096431.78116: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096431.78129: Calling all_plugins_play to load vars for managed_node3 24971 1727096431.78132: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096431.78135: Calling groups_plugins_play to load vars for managed_node3 24971 1727096431.78775: done sending task result for task 0afff68d-5257-3482-6844-0000000004b9 24971 1727096431.78778: WORKER PROCESS EXITING 24971 1727096431.79171: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096431.80214: done with get_vars() 24971 1727096431.80234: done getting variables 24971 1727096431.80289: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 24971 1727096431.80403: variable 'profile' from source: include params 24971 1727096431.80409: variable 'interface' from source: play vars 24971 1727096431.80462: variable 'interface' from source: play vars TASK [Assert that the profile is present - 'veth0'] **************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Monday 23 September 2024 09:00:31 -0400 (0:00:00.044) 0:00:19.282 ****** 24971 1727096431.80493: entering _queue_task() for managed_node3/assert 24971 1727096431.80734: worker is 1 (out of 1 available) 24971 1727096431.80745: exiting _queue_task() for managed_node3/assert 24971 1727096431.80758: done queuing things up, now waiting for results queue to drain 24971 1727096431.80759: waiting for pending results... 
24971 1727096431.81183: running TaskExecutor() for managed_node3/TASK: Assert that the profile is present - 'veth0' 24971 1727096431.81189: in run() - task 0afff68d-5257-3482-6844-0000000003b9 24971 1727096431.81192: variable 'ansible_search_path' from source: unknown 24971 1727096431.81194: variable 'ansible_search_path' from source: unknown 24971 1727096431.81197: calling self._execute() 24971 1727096431.81289: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096431.81298: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096431.81315: variable 'omit' from source: magic vars 24971 1727096431.81651: variable 'ansible_distribution_major_version' from source: facts 24971 1727096431.81671: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096431.81684: variable 'omit' from source: magic vars 24971 1727096431.81728: variable 'omit' from source: magic vars 24971 1727096431.81825: variable 'profile' from source: include params 24971 1727096431.81834: variable 'interface' from source: play vars 24971 1727096431.81901: variable 'interface' from source: play vars 24971 1727096431.81924: variable 'omit' from source: magic vars 24971 1727096431.81970: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24971 1727096431.82007: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24971 1727096431.82029: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24971 1727096431.82059: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096431.82176: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096431.82179: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24971 1727096431.82182: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096431.82184: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096431.82223: Set connection var ansible_shell_type to sh 24971 1727096431.82236: Set connection var ansible_shell_executable to /bin/sh 24971 1727096431.82250: Set connection var ansible_timeout to 10 24971 1727096431.82299: Set connection var ansible_connection to ssh 24971 1727096431.82302: Set connection var ansible_pipelining to False 24971 1727096431.82305: Set connection var ansible_module_compression to ZIP_DEFLATED 24971 1727096431.82307: variable 'ansible_shell_executable' from source: unknown 24971 1727096431.82309: variable 'ansible_connection' from source: unknown 24971 1727096431.82311: variable 'ansible_module_compression' from source: unknown 24971 1727096431.82318: variable 'ansible_shell_type' from source: unknown 24971 1727096431.82324: variable 'ansible_shell_executable' from source: unknown 24971 1727096431.82331: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096431.82343: variable 'ansible_pipelining' from source: unknown 24971 1727096431.82355: variable 'ansible_timeout' from source: unknown 24971 1727096431.82408: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096431.82504: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24971 1727096431.82525: variable 'omit' from source: magic vars 24971 1727096431.82534: starting attempt loop 24971 1727096431.82540: running the handler 24971 1727096431.82657: variable 'lsr_net_profile_exists' from source: set_fact 24971 1727096431.82668: Evaluated conditional (lsr_net_profile_exists): True 24971 1727096431.82679: handler run complete 24971 1727096431.82698: attempt loop complete, returning result 24971 1727096431.82732: _execute() done 24971 1727096431.82736: dumping result to json 24971 1727096431.82738: done dumping result, returning 24971 1727096431.82741: done running TaskExecutor() for managed_node3/TASK: Assert that the profile is present - 'veth0' [0afff68d-5257-3482-6844-0000000003b9] 24971 1727096431.82743: sending task result for task 0afff68d-5257-3482-6844-0000000003b9 ok: [managed_node3] => { "changed": false } MSG: All assertions passed 24971 1727096431.82886: no more pending results, returning what we have 24971 1727096431.82894: results queue empty 24971 1727096431.82896: checking for any_errors_fatal 24971 1727096431.82903: done checking for any_errors_fatal 24971 1727096431.82904: checking for max_fail_percentage 24971 1727096431.82905: done checking for max_fail_percentage 24971 1727096431.82906: checking to see if all hosts have failed and the running result is not ok 24971 1727096431.82907: done checking to see if all hosts have failed 24971 1727096431.82908: getting the remaining hosts for this loop 24971 1727096431.82909: done getting the remaining hosts for this loop 24971 1727096431.82912: getting the next task for host managed_node3 24971 1727096431.82919: done getting next task for host managed_node3 24971 1727096431.82922: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 24971 1727096431.82972: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24971 1727096431.82977: getting variables 24971 1727096431.82979: in VariableManager get_vars() 24971 1727096431.83017: Calling all_inventory to load vars for managed_node3 24971 1727096431.83020: Calling groups_inventory to load vars for managed_node3 24971 1727096431.83022: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096431.83034: Calling all_plugins_play to load vars for managed_node3 24971 1727096431.83037: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096431.83040: Calling groups_plugins_play to load vars for managed_node3 24971 1727096431.83735: done sending task result for task 0afff68d-5257-3482-6844-0000000003b9 24971 1727096431.83738: WORKER PROCESS EXITING 24971 1727096431.84596: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096431.86133: done with get_vars() 24971 1727096431.86152: done getting variables 24971 1727096431.86206: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 24971 1727096431.86308: variable 'profile' from source: include params 24971 1727096431.86312: variable 'interface' from source: play vars 24971 1727096431.86366: variable 'interface' from source: play vars TASK [Assert that the ansible managed comment is present in 'veth0'] *********** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Monday 23 September 2024 09:00:31 -0400 (0:00:00.059) 0:00:19.341 ****** 24971 1727096431.86401: entering _queue_task() for managed_node3/assert 24971 1727096431.86639: worker is 1 (out of 1 available) 24971 1727096431.86650: exiting _queue_task() for managed_node3/assert 24971 1727096431.86661: done queuing things up, now waiting for results queue to drain 24971 1727096431.86662: waiting for pending results... 
24971 1727096431.86926: running TaskExecutor() for managed_node3/TASK: Assert that the ansible managed comment is present in 'veth0' 24971 1727096431.87178: in run() - task 0afff68d-5257-3482-6844-0000000003ba 24971 1727096431.87182: variable 'ansible_search_path' from source: unknown 24971 1727096431.87185: variable 'ansible_search_path' from source: unknown 24971 1727096431.87376: calling self._execute() 24971 1727096431.87675: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096431.87679: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096431.87683: variable 'omit' from source: magic vars 24971 1727096431.88082: variable 'ansible_distribution_major_version' from source: facts 24971 1727096431.88101: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096431.88121: variable 'omit' from source: magic vars 24971 1727096431.88181: variable 'omit' from source: magic vars 24971 1727096431.88293: variable 'profile' from source: include params 24971 1727096431.88304: variable 'interface' from source: play vars 24971 1727096431.88378: variable 'interface' from source: play vars 24971 1727096431.88403: variable 'omit' from source: magic vars 24971 1727096431.88451: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24971 1727096431.88495: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24971 1727096431.88520: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24971 1727096431.88548: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096431.88565: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096431.88605: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24971 1727096431.88614: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096431.88622: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096431.88730: Set connection var ansible_shell_type to sh 24971 1727096431.88743: Set connection var ansible_shell_executable to /bin/sh 24971 1727096431.88754: Set connection var ansible_timeout to 10 24971 1727096431.89177: Set connection var ansible_connection to ssh 24971 1727096431.89181: Set connection var ansible_pipelining to False 24971 1727096431.89183: Set connection var ansible_module_compression to ZIP_DEFLATED 24971 1727096431.89185: variable 'ansible_shell_executable' from source: unknown 24971 1727096431.89187: variable 'ansible_connection' from source: unknown 24971 1727096431.89189: variable 'ansible_module_compression' from source: unknown 24971 1727096431.89191: variable 'ansible_shell_type' from source: unknown 24971 1727096431.89193: variable 'ansible_shell_executable' from source: unknown 24971 1727096431.89195: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096431.89197: variable 'ansible_pipelining' from source: unknown 24971 1727096431.89199: variable 'ansible_timeout' from source: unknown 24971 1727096431.89201: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096431.89207: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24971 1727096431.89210: variable 'omit' from source: magic vars 24971 1727096431.89212: starting attempt loop 24971 1727096431.89218: running the handler 24971 1727096431.89335: variable 'lsr_net_profile_ansible_managed' from source: set_fact 24971 1727096431.89346: Evaluated conditional (lsr_net_profile_ansible_managed): True 24971 1727096431.89358: handler run complete 24971 1727096431.89385: attempt loop complete, returning result 24971 1727096431.89399: _execute() done 24971 1727096431.89407: dumping result to json 24971 1727096431.89415: done dumping result, returning 24971 1727096431.89428: done running TaskExecutor() for managed_node3/TASK: Assert that the ansible managed comment is present in 'veth0' [0afff68d-5257-3482-6844-0000000003ba] 24971 1727096431.89436: sending task result for task 0afff68d-5257-3482-6844-0000000003ba ok: [managed_node3] => { "changed": false } MSG: All assertions passed 24971 1727096431.89575: no more pending results, returning what we have 24971 1727096431.89578: results queue empty 24971 1727096431.89580: checking for any_errors_fatal 24971 1727096431.89585: done checking for any_errors_fatal 24971 1727096431.89586: checking for max_fail_percentage 24971 1727096431.89588: done checking for max_fail_percentage 24971 1727096431.89588: checking to see if all hosts have failed and the running result is not ok 24971 1727096431.89589: done checking to see if all hosts have failed 24971 1727096431.89590: getting the remaining hosts for this loop 24971 1727096431.89591: done getting the remaining hosts for this loop 24971 1727096431.89594: getting the next task for host managed_node3 24971 1727096431.89600: done getting next task for host managed_node3 24971 1727096431.89602: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 24971 1727096431.89605: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24971 1727096431.89609: getting variables 24971 1727096431.89610: in VariableManager get_vars() 24971 1727096431.89648: Calling all_inventory to load vars for managed_node3 24971 1727096431.89650: Calling groups_inventory to load vars for managed_node3 24971 1727096431.89652: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096431.89664: Calling all_plugins_play to load vars for managed_node3 24971 1727096431.89669: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096431.89672: Calling groups_plugins_play to load vars for managed_node3 24971 1727096431.90192: done sending task result for task 0afff68d-5257-3482-6844-0000000003ba 24971 1727096431.90195: WORKER PROCESS EXITING 24971 1727096431.91045: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096431.92805: done with get_vars() 24971 1727096431.92828: done getting variables 24971 1727096431.92898: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 24971 1727096431.93013: variable 'profile' from source: include params 24971 1727096431.93017: variable 'interface' from source: play vars 24971 1727096431.93084: variable 'interface' from source: play vars TASK [Assert that the fingerprint comment is present in veth0] ***************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Monday 23 September 2024 09:00:31 -0400 (0:00:00.067) 0:00:19.408 ****** 24971 1727096431.93119: entering _queue_task() for managed_node3/assert 24971 1727096431.93438: worker is 1 (out of 1 available) 24971 1727096431.93451: exiting _queue_task() for managed_node3/assert 24971 1727096431.93464: done queuing things up, now waiting for results queue to drain 24971 1727096431.93465: waiting for pending results... 
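The three assert tasks from assert_profile_present.yml (file lines 5, 10, and 15, per the task paths in this log) each check one of the facts set earlier by get_profile_stat.yml. A minimal sketch consistent with the evaluated conditionals (lsr_net_profile_exists, lsr_net_profile_ansible_managed, lsr_net_profile_fingerprint); the real tasks may add fail_msg text or further conditions.

# Sketch only: task names and conditions are taken from the log; everything else is assumed.
- name: "Assert that the profile is present - '{{ profile }}'"
  ansible.builtin.assert:
    that:
      - lsr_net_profile_exists

- name: "Assert that the ansible managed comment is present in '{{ profile }}'"
  ansible.builtin.assert:
    that:
      - lsr_net_profile_ansible_managed

- name: Assert that the fingerprint comment is present in {{ profile }}
  ansible.builtin.assert:
    that:
      - lsr_net_profile_fingerprint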
24971 1727096431.93696: running TaskExecutor() for managed_node3/TASK: Assert that the fingerprint comment is present in veth0 24971 1727096431.93834: in run() - task 0afff68d-5257-3482-6844-0000000003bb 24971 1727096431.93860: variable 'ansible_search_path' from source: unknown 24971 1727096431.93872: variable 'ansible_search_path' from source: unknown 24971 1727096431.93965: calling self._execute() 24971 1727096431.94044: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096431.94056: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096431.94086: variable 'omit' from source: magic vars 24971 1727096431.94692: variable 'ansible_distribution_major_version' from source: facts 24971 1727096431.94695: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096431.94698: variable 'omit' from source: magic vars 24971 1727096431.94701: variable 'omit' from source: magic vars 24971 1727096431.94735: variable 'profile' from source: include params 24971 1727096431.94744: variable 'interface' from source: play vars 24971 1727096431.94814: variable 'interface' from source: play vars 24971 1727096431.94843: variable 'omit' from source: magic vars 24971 1727096431.94887: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24971 1727096431.94939: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24971 1727096431.94964: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24971 1727096431.95040: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096431.95043: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096431.95047: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24971 1727096431.95058: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096431.95065: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096431.95163: Set connection var ansible_shell_type to sh 24971 1727096431.95258: Set connection var ansible_shell_executable to /bin/sh 24971 1727096431.95262: Set connection var ansible_timeout to 10 24971 1727096431.95266: Set connection var ansible_connection to ssh 24971 1727096431.95269: Set connection var ansible_pipelining to False 24971 1727096431.95271: Set connection var ansible_module_compression to ZIP_DEFLATED 24971 1727096431.95273: variable 'ansible_shell_executable' from source: unknown 24971 1727096431.95275: variable 'ansible_connection' from source: unknown 24971 1727096431.95277: variable 'ansible_module_compression' from source: unknown 24971 1727096431.95279: variable 'ansible_shell_type' from source: unknown 24971 1727096431.95281: variable 'ansible_shell_executable' from source: unknown 24971 1727096431.95282: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096431.95284: variable 'ansible_pipelining' from source: unknown 24971 1727096431.95286: variable 'ansible_timeout' from source: unknown 24971 1727096431.95288: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096431.95445: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24971 1727096431.95480: variable 'omit' from source: magic vars 24971 1727096431.95506: starting attempt loop 24971 1727096431.95513: running the handler 24971 1727096431.95602: variable 'lsr_net_profile_fingerprint' from source: set_fact 24971 1727096431.95605: Evaluated conditional (lsr_net_profile_fingerprint): True 24971 1727096431.95611: handler run complete 24971 1727096431.95624: attempt loop complete, returning result 24971 1727096431.95627: _execute() done 24971 1727096431.95630: dumping result to json 24971 1727096431.95632: done dumping result, returning 24971 1727096431.95639: done running TaskExecutor() for managed_node3/TASK: Assert that the fingerprint comment is present in veth0 [0afff68d-5257-3482-6844-0000000003bb] 24971 1727096431.95643: sending task result for task 0afff68d-5257-3482-6844-0000000003bb 24971 1727096431.95717: done sending task result for task 0afff68d-5257-3482-6844-0000000003bb 24971 1727096431.95720: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 24971 1727096431.95775: no more pending results, returning what we have 24971 1727096431.95779: results queue empty 24971 1727096431.95780: checking for any_errors_fatal 24971 1727096431.95785: done checking for any_errors_fatal 24971 1727096431.95785: checking for max_fail_percentage 24971 1727096431.95787: done checking for max_fail_percentage 24971 1727096431.95788: checking to see if all hosts have failed and the running result is not ok 24971 1727096431.95789: done checking to see if all hosts have failed 24971 1727096431.95790: getting the remaining hosts for this loop 24971 1727096431.95791: done getting the remaining hosts for this loop 24971 1727096431.95794: getting the next task for host managed_node3 24971 1727096431.95802: done getting next task for host managed_node3 24971 1727096431.95804: ^ task is: TASK: Get ip address information 24971 1727096431.95806: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24971 1727096431.95811: getting variables 24971 1727096431.95812: in VariableManager get_vars() 24971 1727096431.95852: Calling all_inventory to load vars for managed_node3 24971 1727096431.95855: Calling groups_inventory to load vars for managed_node3 24971 1727096431.95857: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096431.95867: Calling all_plugins_play to load vars for managed_node3 24971 1727096431.95873: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096431.95876: Calling groups_plugins_play to load vars for managed_node3 24971 1727096431.96630: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096431.97598: done with get_vars() 24971 1727096431.97619: done getting variables 24971 1727096431.97677: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get ip address information] ********************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:53 Monday 23 September 2024 09:00:31 -0400 (0:00:00.045) 0:00:19.454 ****** 24971 1727096431.97705: entering _queue_task() for managed_node3/command 24971 1727096431.97962: worker is 1 (out of 1 available) 24971 1727096431.98177: exiting _queue_task() for managed_node3/command 24971 1727096431.98187: done queuing things up, now waiting for results queue to drain 24971 1727096431.98188: waiting for pending results... 24971 1727096431.98303: running TaskExecutor() for managed_node3/TASK: Get ip address information 24971 1727096431.98373: in run() - task 0afff68d-5257-3482-6844-00000000005e 24971 1727096431.98378: variable 'ansible_search_path' from source: unknown 24971 1727096431.98406: calling self._execute() 24971 1727096431.98480: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096431.98484: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096431.98493: variable 'omit' from source: magic vars 24971 1727096431.98742: variable 'ansible_distribution_major_version' from source: facts 24971 1727096431.98752: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096431.98758: variable 'omit' from source: magic vars 24971 1727096431.98779: variable 'omit' from source: magic vars 24971 1727096431.98842: variable 'interface' from source: play vars 24971 1727096431.98858: variable 'omit' from source: magic vars 24971 1727096431.98892: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24971 1727096431.98920: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24971 1727096431.98935: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24971 1727096431.98947: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096431.98958: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096431.98983: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24971 
1727096431.98987: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096431.98990: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096431.99053: Set connection var ansible_shell_type to sh 24971 1727096431.99059: Set connection var ansible_shell_executable to /bin/sh 24971 1727096431.99071: Set connection var ansible_timeout to 10 24971 1727096431.99074: Set connection var ansible_connection to ssh 24971 1727096431.99079: Set connection var ansible_pipelining to False 24971 1727096431.99084: Set connection var ansible_module_compression to ZIP_DEFLATED 24971 1727096431.99102: variable 'ansible_shell_executable' from source: unknown 24971 1727096431.99105: variable 'ansible_connection' from source: unknown 24971 1727096431.99107: variable 'ansible_module_compression' from source: unknown 24971 1727096431.99110: variable 'ansible_shell_type' from source: unknown 24971 1727096431.99112: variable 'ansible_shell_executable' from source: unknown 24971 1727096431.99114: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096431.99116: variable 'ansible_pipelining' from source: unknown 24971 1727096431.99118: variable 'ansible_timeout' from source: unknown 24971 1727096431.99126: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096431.99218: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24971 1727096431.99228: variable 'omit' from source: magic vars 24971 1727096431.99231: starting attempt loop 24971 1727096431.99234: running the handler 24971 1727096431.99249: _low_level_execute_command(): starting 24971 1727096431.99256: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24971 1727096431.99731: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096431.99735: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096431.99738: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096431.99740: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096431.99794: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096431.99799: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096431.99840: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096432.01478: stdout chunk 
(state=3): >>>/root <<< 24971 1727096432.01583: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096432.01612: stderr chunk (state=3): >>><<< 24971 1727096432.01615: stdout chunk (state=3): >>><<< 24971 1727096432.01628: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096432.01665: _low_level_execute_command(): starting 24971 1727096432.01671: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096432.0163193-25814-94718347126485 `" && echo ansible-tmp-1727096432.0163193-25814-94718347126485="` echo /root/.ansible/tmp/ansible-tmp-1727096432.0163193-25814-94718347126485 `" ) && sleep 0' 24971 1727096432.02055: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096432.02058: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096432.02060: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration <<< 24971 1727096432.02074: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24971 1727096432.02077: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096432.02124: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096432.02127: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096432.02157: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096432.04092: stdout chunk (state=3): 
>>>ansible-tmp-1727096432.0163193-25814-94718347126485=/root/.ansible/tmp/ansible-tmp-1727096432.0163193-25814-94718347126485 <<< 24971 1727096432.04183: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096432.04186: stdout chunk (state=3): >>><<< 24971 1727096432.04188: stderr chunk (state=3): >>><<< 24971 1727096432.04377: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096432.0163193-25814-94718347126485=/root/.ansible/tmp/ansible-tmp-1727096432.0163193-25814-94718347126485 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096432.04380: variable 'ansible_module_compression' from source: unknown 24971 1727096432.04383: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24971ctsu8jan/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 24971 1727096432.04385: variable 'ansible_facts' from source: unknown 24971 1727096432.04425: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096432.0163193-25814-94718347126485/AnsiballZ_command.py 24971 1727096432.04578: Sending initial data 24971 1727096432.04588: Sent initial data (155 bytes) 24971 1727096432.04990: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096432.05002: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096432.05014: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096432.05061: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096432.05077: stderr chunk (state=3): >>>debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096432.05108: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096432.06671: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24971 1727096432.06716: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 24971 1727096432.06760: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24971ctsu8jan/tmp17vc9ptf /root/.ansible/tmp/ansible-tmp-1727096432.0163193-25814-94718347126485/AnsiballZ_command.py <<< 24971 1727096432.06764: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096432.0163193-25814-94718347126485/AnsiballZ_command.py" <<< 24971 1727096432.06818: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24971ctsu8jan/tmp17vc9ptf" to remote "/root/.ansible/tmp/ansible-tmp-1727096432.0163193-25814-94718347126485/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096432.0163193-25814-94718347126485/AnsiballZ_command.py" <<< 24971 1727096432.07457: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096432.07502: stderr chunk (state=3): >>><<< 24971 1727096432.07504: stdout chunk (state=3): >>><<< 24971 1727096432.07578: done transferring module to remote 24971 1727096432.07582: _low_level_execute_command(): starting 24971 1727096432.07585: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096432.0163193-25814-94718347126485/ /root/.ansible/tmp/ansible-tmp-1727096432.0163193-25814-94718347126485/AnsiballZ_command.py && sleep 0' 24971 1727096432.07959: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096432.07963: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096432.07965: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096432.07969: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 
24971 1727096432.07971: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096432.08020: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096432.08027: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096432.08056: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096432.09784: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096432.09803: stderr chunk (state=3): >>><<< 24971 1727096432.09807: stdout chunk (state=3): >>><<< 24971 1727096432.09818: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096432.09821: _low_level_execute_command(): starting 24971 1727096432.09826: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096432.0163193-25814-94718347126485/AnsiballZ_command.py && sleep 0' 24971 1727096432.10217: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096432.10223: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096432.10240: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096432.10289: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096432.10292: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 24971 1727096432.10332: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096432.25868: stdout chunk (state=3): >>> {"changed": true, "stdout": "34: veth0@if33: mtu 1500 qdisc noqueue state UP group default qlen 1000\n link/ether 8a:4c:81:76:89:bd brd ff:ff:ff:ff:ff:ff link-netns ns1\n inet6 2001:db8::2/32 scope global noprefixroute \n valid_lft forever preferred_lft forever\n inet6 2001:db8::3/32 scope global noprefixroute \n valid_lft forever preferred_lft forever\n inet6 2001:db8::4/32 scope global noprefixroute \n valid_lft forever preferred_lft forever\n inet6 fe80::884c:81ff:fe76:89bd/64 scope link noprefixroute \n valid_lft forever preferred_lft forever", "stderr": "", "rc": 0, "cmd": ["ip", "addr", "show", "veth0"], "start": "2024-09-23 09:00:32.251677", "end": "2024-09-23 09:00:32.255569", "delta": "0:00:00.003892", "msg": "", "invocation": {"module_args": {"_raw_params": "ip addr show veth0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 24971 1727096432.27393: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. <<< 24971 1727096432.27406: stderr chunk (state=3): >>><<< 24971 1727096432.27409: stdout chunk (state=3): >>><<< 24971 1727096432.27425: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "34: veth0@if33: mtu 1500 qdisc noqueue state UP group default qlen 1000\n link/ether 8a:4c:81:76:89:bd brd ff:ff:ff:ff:ff:ff link-netns ns1\n inet6 2001:db8::2/32 scope global noprefixroute \n valid_lft forever preferred_lft forever\n inet6 2001:db8::3/32 scope global noprefixroute \n valid_lft forever preferred_lft forever\n inet6 2001:db8::4/32 scope global noprefixroute \n valid_lft forever preferred_lft forever\n inet6 fe80::884c:81ff:fe76:89bd/64 scope link noprefixroute \n valid_lft forever preferred_lft forever", "stderr": "", "rc": 0, "cmd": ["ip", "addr", "show", "veth0"], "start": "2024-09-23 09:00:32.251677", "end": "2024-09-23 09:00:32.255569", "delta": "0:00:00.003892", "msg": "", "invocation": {"module_args": {"_raw_params": "ip addr show veth0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: 
fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 24971 1727096432.27460: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip addr show veth0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096432.0163193-25814-94718347126485/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24971 1727096432.27471: _low_level_execute_command(): starting 24971 1727096432.27477: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096432.0163193-25814-94718347126485/ > /dev/null 2>&1 && sleep 0' 24971 1727096432.27917: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096432.27921: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096432.27923: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096432.27926: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 24971 1727096432.27929: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096432.27975: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096432.27981: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096432.28020: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096432.30075: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096432.30078: stdout chunk (state=3): >>><<< 24971 1727096432.30081: stderr chunk (state=3): >>><<< 24971 1727096432.30083: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096432.30085: handler run complete 24971 1727096432.30087: Evaluated conditional (False): False 24971 1727096432.30089: attempt loop complete, returning result 24971 1727096432.30091: _execute() done 24971 1727096432.30093: dumping result to json 24971 1727096432.30095: done dumping result, returning 24971 1727096432.30097: done running TaskExecutor() for managed_node3/TASK: Get ip address information [0afff68d-5257-3482-6844-00000000005e] 24971 1727096432.30099: sending task result for task 0afff68d-5257-3482-6844-00000000005e 24971 1727096432.30172: done sending task result for task 0afff68d-5257-3482-6844-00000000005e 24971 1727096432.30176: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "ip", "addr", "show", "veth0" ], "delta": "0:00:00.003892", "end": "2024-09-23 09:00:32.255569", "rc": 0, "start": "2024-09-23 09:00:32.251677" } STDOUT: 34: veth0@if33: mtu 1500 qdisc noqueue state UP group default qlen 1000 link/ether 8a:4c:81:76:89:bd brd ff:ff:ff:ff:ff:ff link-netns ns1 inet6 2001:db8::2/32 scope global noprefixroute valid_lft forever preferred_lft forever inet6 2001:db8::3/32 scope global noprefixroute valid_lft forever preferred_lft forever inet6 2001:db8::4/32 scope global noprefixroute valid_lft forever preferred_lft forever inet6 fe80::884c:81ff:fe76:89bd/64 scope link noprefixroute valid_lft forever preferred_lft forever 24971 1727096432.30260: no more pending results, returning what we have 24971 1727096432.30264: results queue empty 24971 1727096432.30265: checking for any_errors_fatal 24971 1727096432.30273: done checking for any_errors_fatal 24971 1727096432.30274: checking for max_fail_percentage 24971 1727096432.30276: done checking for max_fail_percentage 24971 1727096432.30277: checking to see if all hosts have failed and the running result is not ok 24971 1727096432.30278: done checking to see if all hosts have failed 24971 1727096432.30279: getting the remaining hosts for this loop 24971 1727096432.30280: done getting the remaining hosts for this loop 24971 1727096432.30284: getting the next task for host managed_node3 24971 1727096432.30292: done getting next task for host managed_node3 24971 1727096432.30294: ^ task is: TASK: Show ip_addr 24971 1727096432.30296: ^ state is: HOST STATE: block=3, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
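
The trace above covers the full lifecycle of the "Get ip address information" task: the interface play variable is templated into the command, a remote temp directory is created over the multiplexed SSH connection, AnsiballZ_command.py is transferred and run with /usr/bin/python3.12, the temp directory is removed, and the task reports changed: false even though the module itself returned changed: true. A minimal sketch of a task consistent with this trace follows; the register name is inferred from the ip_addr variable the next task prints, changed_when is inferred from the suppressed change, and the exact wording at tests_ipv6.yml:53 may differ. The ansible_distribution_major_version != '6' guard seen in the log may sit on the task or be inherited from the play.

    # Sketch inferred from the log; not the verbatim task in tests_ipv6.yml.
    - name: Get ip address information
      command: "ip addr show {{ interface }}"   # 'interface' comes from play vars per the log
      register: ip_addr                         # the following "Show ip_addr" task reads ip_addr.stdout
      changed_when: false                       # final result shows changed: false despite module output
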
False 24971 1727096432.30300: getting variables 24971 1727096432.30302: in VariableManager get_vars() 24971 1727096432.30343: Calling all_inventory to load vars for managed_node3 24971 1727096432.30346: Calling groups_inventory to load vars for managed_node3 24971 1727096432.30348: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096432.30360: Calling all_plugins_play to load vars for managed_node3 24971 1727096432.30363: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096432.30366: Calling groups_plugins_play to load vars for managed_node3 24971 1727096432.36569: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096432.38148: done with get_vars() 24971 1727096432.38171: done getting variables 24971 1727096432.38222: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show ip_addr] ************************************************************ task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:57 Monday 23 September 2024 09:00:32 -0400 (0:00:00.405) 0:00:19.860 ****** 24971 1727096432.38246: entering _queue_task() for managed_node3/debug 24971 1727096432.38784: worker is 1 (out of 1 available) 24971 1727096432.38793: exiting _queue_task() for managed_node3/debug 24971 1727096432.38803: done queuing things up, now waiting for results queue to drain 24971 1727096432.38804: waiting for pending results... 24971 1727096432.38882: running TaskExecutor() for managed_node3/TASK: Show ip_addr 24971 1727096432.39033: in run() - task 0afff68d-5257-3482-6844-00000000005f 24971 1727096432.39037: variable 'ansible_search_path' from source: unknown 24971 1727096432.39053: calling self._execute() 24971 1727096432.39159: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096432.39172: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096432.39249: variable 'omit' from source: magic vars 24971 1727096432.39673: variable 'ansible_distribution_major_version' from source: facts 24971 1727096432.39875: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096432.39879: variable 'omit' from source: magic vars 24971 1727096432.39881: variable 'omit' from source: magic vars 24971 1727096432.39884: variable 'omit' from source: magic vars 24971 1727096432.39918: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24971 1727096432.40021: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24971 1727096432.40045: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24971 1727096432.40112: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096432.40132: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096432.40207: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24971 1727096432.40234: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 
1727096432.40245: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096432.40528: Set connection var ansible_shell_type to sh 24971 1727096432.40532: Set connection var ansible_shell_executable to /bin/sh 24971 1727096432.40543: Set connection var ansible_timeout to 10 24971 1727096432.40559: Set connection var ansible_connection to ssh 24971 1727096432.40571: Set connection var ansible_pipelining to False 24971 1727096432.40583: Set connection var ansible_module_compression to ZIP_DEFLATED 24971 1727096432.40610: variable 'ansible_shell_executable' from source: unknown 24971 1727096432.40642: variable 'ansible_connection' from source: unknown 24971 1727096432.40666: variable 'ansible_module_compression' from source: unknown 24971 1727096432.40676: variable 'ansible_shell_type' from source: unknown 24971 1727096432.40789: variable 'ansible_shell_executable' from source: unknown 24971 1727096432.40797: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096432.40806: variable 'ansible_pipelining' from source: unknown 24971 1727096432.40879: variable 'ansible_timeout' from source: unknown 24971 1727096432.40884: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096432.40950: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24971 1727096432.40975: variable 'omit' from source: magic vars 24971 1727096432.40991: starting attempt loop 24971 1727096432.40998: running the handler 24971 1727096432.41139: variable 'ip_addr' from source: set_fact 24971 1727096432.41163: handler run complete 24971 1727096432.41198: attempt loop complete, returning result 24971 1727096432.41206: _execute() done 24971 1727096432.41213: dumping result to json 24971 1727096432.41220: done dumping result, returning 24971 1727096432.41230: done running TaskExecutor() for managed_node3/TASK: Show ip_addr [0afff68d-5257-3482-6844-00000000005f] 24971 1727096432.41238: sending task result for task 0afff68d-5257-3482-6844-00000000005f ok: [managed_node3] => { "ip_addr.stdout": "34: veth0@if33: mtu 1500 qdisc noqueue state UP group default qlen 1000\n link/ether 8a:4c:81:76:89:bd brd ff:ff:ff:ff:ff:ff link-netns ns1\n inet6 2001:db8::2/32 scope global noprefixroute \n valid_lft forever preferred_lft forever\n inet6 2001:db8::3/32 scope global noprefixroute \n valid_lft forever preferred_lft forever\n inet6 2001:db8::4/32 scope global noprefixroute \n valid_lft forever preferred_lft forever\n inet6 fe80::884c:81ff:fe76:89bd/64 scope link noprefixroute \n valid_lft forever preferred_lft forever" } 24971 1727096432.41427: no more pending results, returning what we have 24971 1727096432.41431: results queue empty 24971 1727096432.41432: checking for any_errors_fatal 24971 1727096432.41442: done checking for any_errors_fatal 24971 1727096432.41442: checking for max_fail_percentage 24971 1727096432.41445: done checking for max_fail_percentage 24971 1727096432.41446: checking to see if all hosts have failed and the running result is not ok 24971 1727096432.41447: done checking to see if all hosts have failed 24971 1727096432.41448: getting the remaining hosts for this loop 24971 1727096432.41449: done getting the remaining hosts for this loop 24971 1727096432.41454: getting the next task for 
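
The debug task above only renders a variable, and the result key ip_addr.stdout suggests the var form of debug rather than msg. A sketch of the task at tests_ipv6.yml:57 under that assumption:

    - name: Show ip_addr
      debug:
        var: ip_addr.stdout   # prints the registered "ip addr show veth0" output
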
host managed_node3 24971 1727096432.41460: done getting next task for host managed_node3 24971 1727096432.41463: ^ task is: TASK: Assert ipv6 addresses are correctly set 24971 1727096432.41465: ^ state is: HOST STATE: block=3, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24971 1727096432.41470: getting variables 24971 1727096432.41472: in VariableManager get_vars() 24971 1727096432.41515: Calling all_inventory to load vars for managed_node3 24971 1727096432.41519: Calling groups_inventory to load vars for managed_node3 24971 1727096432.41521: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096432.41533: Calling all_plugins_play to load vars for managed_node3 24971 1727096432.41536: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096432.41540: Calling groups_plugins_play to load vars for managed_node3 24971 1727096432.42181: done sending task result for task 0afff68d-5257-3482-6844-00000000005f 24971 1727096432.42184: WORKER PROCESS EXITING 24971 1727096432.43165: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096432.44842: done with get_vars() 24971 1727096432.44864: done getting variables 24971 1727096432.44927: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Assert ipv6 addresses are correctly set] ********************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:60 Monday 23 September 2024 09:00:32 -0400 (0:00:00.067) 0:00:19.927 ****** 24971 1727096432.44963: entering _queue_task() for managed_node3/assert 24971 1727096432.45240: worker is 1 (out of 1 available) 24971 1727096432.45251: exiting _queue_task() for managed_node3/assert 24971 1727096432.45374: done queuing things up, now waiting for results queue to drain 24971 1727096432.45376: waiting for pending results... 
24971 1727096432.45546: running TaskExecutor() for managed_node3/TASK: Assert ipv6 addresses are correctly set 24971 1727096432.45646: in run() - task 0afff68d-5257-3482-6844-000000000060 24971 1727096432.45666: variable 'ansible_search_path' from source: unknown 24971 1727096432.45716: calling self._execute() 24971 1727096432.45829: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096432.45841: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096432.45859: variable 'omit' from source: magic vars 24971 1727096432.46259: variable 'ansible_distribution_major_version' from source: facts 24971 1727096432.46281: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096432.46292: variable 'omit' from source: magic vars 24971 1727096432.46318: variable 'omit' from source: magic vars 24971 1727096432.46372: variable 'omit' from source: magic vars 24971 1727096432.46414: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24971 1727096432.46451: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24971 1727096432.46484: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24971 1727096432.46506: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096432.46521: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096432.46578: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24971 1727096432.46582: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096432.46585: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096432.46678: Set connection var ansible_shell_type to sh 24971 1727096432.46873: Set connection var ansible_shell_executable to /bin/sh 24971 1727096432.46876: Set connection var ansible_timeout to 10 24971 1727096432.46879: Set connection var ansible_connection to ssh 24971 1727096432.46881: Set connection var ansible_pipelining to False 24971 1727096432.46883: Set connection var ansible_module_compression to ZIP_DEFLATED 24971 1727096432.46885: variable 'ansible_shell_executable' from source: unknown 24971 1727096432.46887: variable 'ansible_connection' from source: unknown 24971 1727096432.46889: variable 'ansible_module_compression' from source: unknown 24971 1727096432.46892: variable 'ansible_shell_type' from source: unknown 24971 1727096432.46894: variable 'ansible_shell_executable' from source: unknown 24971 1727096432.46896: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096432.46898: variable 'ansible_pipelining' from source: unknown 24971 1727096432.46900: variable 'ansible_timeout' from source: unknown 24971 1727096432.46902: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096432.46936: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24971 1727096432.46950: variable 'omit' from source: magic vars 24971 1727096432.46959: starting attempt loop 24971 1727096432.46965: 
running the handler 24971 1727096432.47131: variable 'ip_addr' from source: set_fact 24971 1727096432.47151: Evaluated conditional ('inet6 2001:db8::2/32' in ip_addr.stdout): True 24971 1727096432.47279: variable 'ip_addr' from source: set_fact 24971 1727096432.47296: Evaluated conditional ('inet6 2001:db8::3/32' in ip_addr.stdout): True 24971 1727096432.47420: variable 'ip_addr' from source: set_fact 24971 1727096432.47457: Evaluated conditional ('inet6 2001:db8::4/32' in ip_addr.stdout): True 24971 1727096432.47461: handler run complete 24971 1727096432.47476: attempt loop complete, returning result 24971 1727096432.47484: _execute() done 24971 1727096432.47567: dumping result to json 24971 1727096432.47572: done dumping result, returning 24971 1727096432.47576: done running TaskExecutor() for managed_node3/TASK: Assert ipv6 addresses are correctly set [0afff68d-5257-3482-6844-000000000060] 24971 1727096432.47578: sending task result for task 0afff68d-5257-3482-6844-000000000060 24971 1727096432.47640: done sending task result for task 0afff68d-5257-3482-6844-000000000060 24971 1727096432.47643: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 24971 1727096432.47713: no more pending results, returning what we have 24971 1727096432.47716: results queue empty 24971 1727096432.47717: checking for any_errors_fatal 24971 1727096432.47722: done checking for any_errors_fatal 24971 1727096432.47722: checking for max_fail_percentage 24971 1727096432.47725: done checking for max_fail_percentage 24971 1727096432.47726: checking to see if all hosts have failed and the running result is not ok 24971 1727096432.47727: done checking to see if all hosts have failed 24971 1727096432.47727: getting the remaining hosts for this loop 24971 1727096432.47729: done getting the remaining hosts for this loop 24971 1727096432.47732: getting the next task for host managed_node3 24971 1727096432.47738: done getting next task for host managed_node3 24971 1727096432.47740: ^ task is: TASK: Get ipv6 routes 24971 1727096432.47742: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
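
The three conditionals evaluated above appear verbatim in the log, so the assert task at tests_ipv6.yml:60 is very likely close to the sketch below; any custom failure message is not visible in this output.

    - name: Assert ipv6 addresses are correctly set
      assert:
        that:
          - "'inet6 2001:db8::2/32' in ip_addr.stdout"
          - "'inet6 2001:db8::3/32' in ip_addr.stdout"
          - "'inet6 2001:db8::4/32' in ip_addr.stdout"
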
False 24971 1727096432.47745: getting variables 24971 1727096432.47747: in VariableManager get_vars() 24971 1727096432.47791: Calling all_inventory to load vars for managed_node3 24971 1727096432.47794: Calling groups_inventory to load vars for managed_node3 24971 1727096432.47796: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096432.47809: Calling all_plugins_play to load vars for managed_node3 24971 1727096432.47812: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096432.47815: Calling groups_plugins_play to load vars for managed_node3 24971 1727096432.49438: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096432.51997: done with get_vars() 24971 1727096432.52025: done getting variables 24971 1727096432.52197: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get ipv6 routes] ********************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:69 Monday 23 September 2024 09:00:32 -0400 (0:00:00.072) 0:00:20.000 ****** 24971 1727096432.52261: entering _queue_task() for managed_node3/command 24971 1727096432.53104: worker is 1 (out of 1 available) 24971 1727096432.53116: exiting _queue_task() for managed_node3/command 24971 1727096432.53130: done queuing things up, now waiting for results queue to drain 24971 1727096432.53131: waiting for pending results... 24971 1727096432.53899: running TaskExecutor() for managed_node3/TASK: Get ipv6 routes 24971 1727096432.53903: in run() - task 0afff68d-5257-3482-6844-000000000061 24971 1727096432.53906: variable 'ansible_search_path' from source: unknown 24971 1727096432.54376: calling self._execute() 24971 1727096432.54380: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096432.54383: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096432.54385: variable 'omit' from source: magic vars 24971 1727096432.55059: variable 'ansible_distribution_major_version' from source: facts 24971 1727096432.55080: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096432.55091: variable 'omit' from source: magic vars 24971 1727096432.55113: variable 'omit' from source: magic vars 24971 1727096432.55153: variable 'omit' from source: magic vars 24971 1727096432.55575: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24971 1727096432.55579: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24971 1727096432.55582: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24971 1727096432.55585: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096432.55589: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096432.55592: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24971 1727096432.55595: variable 'ansible_host' from source: host vars for 
'managed_node3' 24971 1727096432.55598: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096432.55651: Set connection var ansible_shell_type to sh 24971 1727096432.56075: Set connection var ansible_shell_executable to /bin/sh 24971 1727096432.56078: Set connection var ansible_timeout to 10 24971 1727096432.56081: Set connection var ansible_connection to ssh 24971 1727096432.56083: Set connection var ansible_pipelining to False 24971 1727096432.56085: Set connection var ansible_module_compression to ZIP_DEFLATED 24971 1727096432.56087: variable 'ansible_shell_executable' from source: unknown 24971 1727096432.56089: variable 'ansible_connection' from source: unknown 24971 1727096432.56091: variable 'ansible_module_compression' from source: unknown 24971 1727096432.56094: variable 'ansible_shell_type' from source: unknown 24971 1727096432.56096: variable 'ansible_shell_executable' from source: unknown 24971 1727096432.56098: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096432.56099: variable 'ansible_pipelining' from source: unknown 24971 1727096432.56102: variable 'ansible_timeout' from source: unknown 24971 1727096432.56104: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096432.56282: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24971 1727096432.56299: variable 'omit' from source: magic vars 24971 1727096432.56308: starting attempt loop 24971 1727096432.56315: running the handler 24971 1727096432.56337: _low_level_execute_command(): starting 24971 1727096432.56351: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24971 1727096432.57818: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096432.57837: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096432.57856: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24971 1727096432.57961: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096432.58025: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096432.58038: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096432.58075: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096432.58195: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096432.59870: stdout chunk (state=3): 
>>>/root <<< 24971 1727096432.60026: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096432.60142: stderr chunk (state=3): >>><<< 24971 1727096432.60145: stdout chunk (state=3): >>><<< 24971 1727096432.60163: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096432.60187: _low_level_execute_command(): starting 24971 1727096432.60200: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096432.6017354-25833-117258433269081 `" && echo ansible-tmp-1727096432.6017354-25833-117258433269081="` echo /root/.ansible/tmp/ansible-tmp-1727096432.6017354-25833-117258433269081 `" ) && sleep 0' 24971 1727096432.61355: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096432.61358: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 24971 1727096432.61361: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096432.61375: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096432.61378: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24971 1727096432.61391: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096432.61424: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096432.61579: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096432.61655: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096432.61808: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 24971 1727096432.63706: stdout chunk (state=3): >>>ansible-tmp-1727096432.6017354-25833-117258433269081=/root/.ansible/tmp/ansible-tmp-1727096432.6017354-25833-117258433269081 <<< 24971 1727096432.63997: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096432.64007: stdout chunk (state=3): >>><<< 24971 1727096432.64018: stderr chunk (state=3): >>><<< 24971 1727096432.64040: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096432.6017354-25833-117258433269081=/root/.ansible/tmp/ansible-tmp-1727096432.6017354-25833-117258433269081 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096432.64087: variable 'ansible_module_compression' from source: unknown 24971 1727096432.64394: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24971ctsu8jan/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 24971 1727096432.64397: variable 'ansible_facts' from source: unknown 24971 1727096432.64486: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096432.6017354-25833-117258433269081/AnsiballZ_command.py 24971 1727096432.64825: Sending initial data 24971 1727096432.64836: Sent initial data (156 bytes) 24971 1727096432.66575: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24971 1727096432.66681: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096432.67174: stderr chunk (state=3): 
>>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096432.67205: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096432.68811: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24971 1727096432.68839: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 24971 1727096432.68900: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096432.6017354-25833-117258433269081/AnsiballZ_command.py" <<< 24971 1727096432.68910: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24971ctsu8jan/tmp7muuv5qd /root/.ansible/tmp/ansible-tmp-1727096432.6017354-25833-117258433269081/AnsiballZ_command.py <<< 24971 1727096432.69277: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24971ctsu8jan/tmp7muuv5qd" to remote "/root/.ansible/tmp/ansible-tmp-1727096432.6017354-25833-117258433269081/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096432.6017354-25833-117258433269081/AnsiballZ_command.py" <<< 24971 1727096432.70293: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096432.70297: stdout chunk (state=3): >>><<< 24971 1727096432.70300: stderr chunk (state=3): >>><<< 24971 1727096432.70410: done transferring module to remote 24971 1727096432.70427: _low_level_execute_command(): starting 24971 1727096432.70439: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096432.6017354-25833-117258433269081/ /root/.ansible/tmp/ansible-tmp-1727096432.6017354-25833-117258433269081/AnsiballZ_command.py && sleep 0' 24971 1727096432.71705: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24971 1727096432.71834: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096432.71975: stderr 
chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096432.72049: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096432.73876: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096432.74007: stdout chunk (state=3): >>><<< 24971 1727096432.74020: stderr chunk (state=3): >>><<< 24971 1727096432.74040: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096432.74049: _low_level_execute_command(): starting 24971 1727096432.74058: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096432.6017354-25833-117258433269081/AnsiballZ_command.py && sleep 0' 24971 1727096432.75343: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24971 1727096432.75358: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096432.75406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096432.75632: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096432.75648: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096432.75748: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096432.75833: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 
1727096432.91408: stdout chunk (state=3): >>> {"changed": true, "stdout": "2001:db8::/32 dev veth0 proto kernel metric 101 pref medium\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nfe80::/64 dev veth0 proto kernel metric 1024 pref medium\ndefault via 2001:db8::1 dev veth0 proto static metric 101 pref medium", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "route"], "start": "2024-09-23 09:00:32.905610", "end": "2024-09-23 09:00:32.909393", "delta": "0:00:00.003783", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 route", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 24971 1727096432.92825: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. <<< 24971 1727096432.92829: stdout chunk (state=3): >>><<< 24971 1727096432.92831: stderr chunk (state=3): >>><<< 24971 1727096432.93078: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "2001:db8::/32 dev veth0 proto kernel metric 101 pref medium\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nfe80::/64 dev veth0 proto kernel metric 1024 pref medium\ndefault via 2001:db8::1 dev veth0 proto static metric 101 pref medium", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "route"], "start": "2024-09-23 09:00:32.905610", "end": "2024-09-23 09:00:32.909393", "delta": "0:00:00.003783", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 route", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
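Annotation: the module JSON above is the raw return of the "Get ipv6 routes" task — ansible.legacy.command ran `ip -6 route` on the remote host, and later log lines read the result back as the variable `ipv6_route` (shown as "from source: set_fact", which is how registered results appear in this verbose output). A minimal sketch of what that task in tests_ipv6.yml presumably looks like; the task name, module and command come from this log, everything else is an assumption:

# Sketch reconstructed from this log; the real task lives in
# tests/network/playbooks/tests_ipv6.yml and may differ in detail.
- name: Get ipv6 routes
  command: ip -6 route
  register: ipv6_route
  # The final task result below reports "changed": false even though the
  # command module itself returned "changed": true, which is what a
  # changed_when: false on the task would produce.
  changed_when: false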
24971 1727096432.93082: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -6 route', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096432.6017354-25833-117258433269081/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24971 1727096432.93085: _low_level_execute_command(): starting 24971 1727096432.93087: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096432.6017354-25833-117258433269081/ > /dev/null 2>&1 && sleep 0' 24971 1727096432.94124: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096432.94137: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration <<< 24971 1727096432.94152: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096432.94232: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096432.94284: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096432.94296: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096432.94334: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096432.96359: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096432.96534: stderr chunk (state=3): >>><<< 24971 1727096432.96543: stdout chunk (state=3): >>><<< 24971 1727096432.96566: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096432.96580: handler run complete 24971 1727096432.96973: Evaluated conditional (False): False 24971 1727096432.96976: attempt loop complete, returning result 24971 1727096432.96978: _execute() done 24971 1727096432.96981: dumping result to json 24971 1727096432.96983: done dumping result, returning 24971 1727096432.96985: done running TaskExecutor() for managed_node3/TASK: Get ipv6 routes [0afff68d-5257-3482-6844-000000000061] 24971 1727096432.96986: sending task result for task 0afff68d-5257-3482-6844-000000000061 24971 1727096432.97057: done sending task result for task 0afff68d-5257-3482-6844-000000000061 24971 1727096432.97060: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "ip", "-6", "route" ], "delta": "0:00:00.003783", "end": "2024-09-23 09:00:32.909393", "rc": 0, "start": "2024-09-23 09:00:32.905610" } STDOUT: 2001:db8::/32 dev veth0 proto kernel metric 101 pref medium fe80::/64 dev eth0 proto kernel metric 1024 pref medium fe80::/64 dev veth0 proto kernel metric 1024 pref medium default via 2001:db8::1 dev veth0 proto static metric 101 pref medium 24971 1727096432.97146: no more pending results, returning what we have 24971 1727096432.97150: results queue empty 24971 1727096432.97151: checking for any_errors_fatal 24971 1727096432.97158: done checking for any_errors_fatal 24971 1727096432.97159: checking for max_fail_percentage 24971 1727096432.97161: done checking for max_fail_percentage 24971 1727096432.97162: checking to see if all hosts have failed and the running result is not ok 24971 1727096432.97163: done checking to see if all hosts have failed 24971 1727096432.97164: getting the remaining hosts for this loop 24971 1727096432.97165: done getting the remaining hosts for this loop 24971 1727096432.97171: getting the next task for host managed_node3 24971 1727096432.97178: done getting next task for host managed_node3 24971 1727096432.97181: ^ task is: TASK: Show ipv6_route 24971 1727096432.97183: ^ state is: HOST STATE: block=3, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24971 1727096432.97187: getting variables 24971 1727096432.97189: in VariableManager get_vars() 24971 1727096432.97230: Calling all_inventory to load vars for managed_node3 24971 1727096432.97232: Calling groups_inventory to load vars for managed_node3 24971 1727096432.97234: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096432.97246: Calling all_plugins_play to load vars for managed_node3 24971 1727096432.97249: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096432.97252: Calling groups_plugins_play to load vars for managed_node3 24971 1727096433.00188: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096433.02036: done with get_vars() 24971 1727096433.02057: done getting variables 24971 1727096433.02111: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show ipv6_route] ********************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:73 Monday 23 September 2024 09:00:33 -0400 (0:00:00.498) 0:00:20.499 ****** 24971 1727096433.02137: entering _queue_task() for managed_node3/debug 24971 1727096433.02445: worker is 1 (out of 1 available) 24971 1727096433.02457: exiting _queue_task() for managed_node3/debug 24971 1727096433.02473: done queuing things up, now waiting for results queue to drain 24971 1727096433.02475: waiting for pending results... 24971 1727096433.02770: running TaskExecutor() for managed_node3/TASK: Show ipv6_route 24971 1727096433.02951: in run() - task 0afff68d-5257-3482-6844-000000000062 24971 1727096433.02955: variable 'ansible_search_path' from source: unknown 24971 1727096433.02969: calling self._execute() 24971 1727096433.03086: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096433.03100: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096433.03115: variable 'omit' from source: magic vars 24971 1727096433.03979: variable 'ansible_distribution_major_version' from source: facts 24971 1727096433.03986: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096433.03989: variable 'omit' from source: magic vars 24971 1727096433.03992: variable 'omit' from source: magic vars 24971 1727096433.04174: variable 'omit' from source: magic vars 24971 1727096433.04177: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24971 1727096433.04227: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24971 1727096433.04278: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24971 1727096433.04360: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096433.04545: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096433.04549: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24971 1727096433.04552: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 
1727096433.04555: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096433.04734: Set connection var ansible_shell_type to sh 24971 1727096433.04753: Set connection var ansible_shell_executable to /bin/sh 24971 1727096433.04773: Set connection var ansible_timeout to 10 24971 1727096433.04786: Set connection var ansible_connection to ssh 24971 1727096433.04797: Set connection var ansible_pipelining to False 24971 1727096433.04808: Set connection var ansible_module_compression to ZIP_DEFLATED 24971 1727096433.04834: variable 'ansible_shell_executable' from source: unknown 24971 1727096433.04843: variable 'ansible_connection' from source: unknown 24971 1727096433.04850: variable 'ansible_module_compression' from source: unknown 24971 1727096433.04865: variable 'ansible_shell_type' from source: unknown 24971 1727096433.04885: variable 'ansible_shell_executable' from source: unknown 24971 1727096433.04894: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096433.04903: variable 'ansible_pipelining' from source: unknown 24971 1727096433.04910: variable 'ansible_timeout' from source: unknown 24971 1727096433.04918: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096433.05063: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24971 1727096433.05085: variable 'omit' from source: magic vars 24971 1727096433.05095: starting attempt loop 24971 1727096433.05103: running the handler 24971 1727096433.05272: variable 'ipv6_route' from source: set_fact 24971 1727096433.05275: handler run complete 24971 1727096433.05277: attempt loop complete, returning result 24971 1727096433.05279: _execute() done 24971 1727096433.05281: dumping result to json 24971 1727096433.05283: done dumping result, returning 24971 1727096433.05290: done running TaskExecutor() for managed_node3/TASK: Show ipv6_route [0afff68d-5257-3482-6844-000000000062] 24971 1727096433.05298: sending task result for task 0afff68d-5257-3482-6844-000000000062 ok: [managed_node3] => { "ipv6_route.stdout": "2001:db8::/32 dev veth0 proto kernel metric 101 pref medium\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nfe80::/64 dev veth0 proto kernel metric 1024 pref medium\ndefault via 2001:db8::1 dev veth0 proto static metric 101 pref medium" } 24971 1727096433.05527: no more pending results, returning what we have 24971 1727096433.05531: results queue empty 24971 1727096433.05532: checking for any_errors_fatal 24971 1727096433.05541: done checking for any_errors_fatal 24971 1727096433.05541: checking for max_fail_percentage 24971 1727096433.05543: done checking for max_fail_percentage 24971 1727096433.05544: checking to see if all hosts have failed and the running result is not ok 24971 1727096433.05545: done checking to see if all hosts have failed 24971 1727096433.05546: getting the remaining hosts for this loop 24971 1727096433.05547: done getting the remaining hosts for this loop 24971 1727096433.05551: getting the next task for host managed_node3 24971 1727096433.05556: done getting next task for host managed_node3 24971 1727096433.05559: ^ task is: TASK: Assert default ipv6 route is set 24971 1727096433.05561: ^ state is: HOST STATE: block=3, task=13, rescue=0, always=0, handlers=0, 
run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24971 1727096433.05565: getting variables 24971 1727096433.05566: in VariableManager get_vars() 24971 1727096433.05607: Calling all_inventory to load vars for managed_node3 24971 1727096433.05610: Calling groups_inventory to load vars for managed_node3 24971 1727096433.05612: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096433.05623: Calling all_plugins_play to load vars for managed_node3 24971 1727096433.05626: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096433.05628: Calling groups_plugins_play to load vars for managed_node3 24971 1727096433.06365: done sending task result for task 0afff68d-5257-3482-6844-000000000062 24971 1727096433.06370: WORKER PROCESS EXITING 24971 1727096433.07800: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096433.09418: done with get_vars() 24971 1727096433.09440: done getting variables 24971 1727096433.09512: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Assert default ipv6 route is set] **************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:76 Monday 23 September 2024 09:00:33 -0400 (0:00:00.074) 0:00:20.573 ****** 24971 1727096433.09546: entering _queue_task() for managed_node3/assert 24971 1727096433.10003: worker is 1 (out of 1 available) 24971 1727096433.10011: exiting _queue_task() for managed_node3/assert 24971 1727096433.10020: done queuing things up, now waiting for results queue to drain 24971 1727096433.10021: waiting for pending results... 
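Annotation: the two tasks being processed here come from tests_ipv6.yml lines 73 and 76 (per the task paths above). A rough sketch of what they presumably look like; the debug variable and the assert condition are taken from this log (see the "Evaluated conditional (__test_str in ipv6_route.stdout)" line below), while the definition of __test_str is not visible here and is deliberately left out:

# Sketch only -- task names and the assert condition are reconstructed from
# this log; __test_str is a task var defined in the real playbook (not shown
# here), evidently matching the "default via ... dev <interface>" route line.
- name: Show ipv6_route
  debug:
    var: ipv6_route.stdout

- name: Assert default ipv6 route is set
  assert:
    that:
      - __test_str in ipv6_route.stdout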
24971 1727096433.10156: running TaskExecutor() for managed_node3/TASK: Assert default ipv6 route is set 24971 1727096433.10255: in run() - task 0afff68d-5257-3482-6844-000000000063 24971 1727096433.10277: variable 'ansible_search_path' from source: unknown 24971 1727096433.10319: calling self._execute() 24971 1727096433.10419: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096433.10429: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096433.10443: variable 'omit' from source: magic vars 24971 1727096433.10809: variable 'ansible_distribution_major_version' from source: facts 24971 1727096433.10828: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096433.10897: variable 'omit' from source: magic vars 24971 1727096433.10901: variable 'omit' from source: magic vars 24971 1727096433.10903: variable 'omit' from source: magic vars 24971 1727096433.10946: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24971 1727096433.10987: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24971 1727096433.11014: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24971 1727096433.11078: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096433.11094: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096433.11233: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24971 1727096433.11236: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096433.11239: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096433.11277: Set connection var ansible_shell_type to sh 24971 1727096433.11336: Set connection var ansible_shell_executable to /bin/sh 24971 1727096433.11352: Set connection var ansible_timeout to 10 24971 1727096433.11361: Set connection var ansible_connection to ssh 24971 1727096433.11408: Set connection var ansible_pipelining to False 24971 1727096433.11418: Set connection var ansible_module_compression to ZIP_DEFLATED 24971 1727096433.11774: variable 'ansible_shell_executable' from source: unknown 24971 1727096433.11777: variable 'ansible_connection' from source: unknown 24971 1727096433.11781: variable 'ansible_module_compression' from source: unknown 24971 1727096433.11784: variable 'ansible_shell_type' from source: unknown 24971 1727096433.11786: variable 'ansible_shell_executable' from source: unknown 24971 1727096433.11788: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096433.11790: variable 'ansible_pipelining' from source: unknown 24971 1727096433.11792: variable 'ansible_timeout' from source: unknown 24971 1727096433.11794: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096433.11826: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24971 1727096433.11922: variable 'omit' from source: magic vars 24971 1727096433.11932: starting attempt loop 24971 1727096433.11938: running 
the handler 24971 1727096433.12190: variable '__test_str' from source: task vars 24971 1727096433.12338: variable 'interface' from source: play vars 24971 1727096433.12405: variable 'ipv6_route' from source: set_fact 24971 1727096433.12424: Evaluated conditional (__test_str in ipv6_route.stdout): True 24971 1727096433.12434: handler run complete 24971 1727096433.12484: attempt loop complete, returning result 24971 1727096433.12491: _execute() done 24971 1727096433.12501: dumping result to json 24971 1727096433.12511: done dumping result, returning 24971 1727096433.12521: done running TaskExecutor() for managed_node3/TASK: Assert default ipv6 route is set [0afff68d-5257-3482-6844-000000000063] 24971 1727096433.12529: sending task result for task 0afff68d-5257-3482-6844-000000000063 24971 1727096433.12666: done sending task result for task 0afff68d-5257-3482-6844-000000000063 24971 1727096433.12671: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 24971 1727096433.12759: no more pending results, returning what we have 24971 1727096433.12763: results queue empty 24971 1727096433.12764: checking for any_errors_fatal 24971 1727096433.12774: done checking for any_errors_fatal 24971 1727096433.12775: checking for max_fail_percentage 24971 1727096433.12777: done checking for max_fail_percentage 24971 1727096433.12778: checking to see if all hosts have failed and the running result is not ok 24971 1727096433.12779: done checking to see if all hosts have failed 24971 1727096433.12780: getting the remaining hosts for this loop 24971 1727096433.12781: done getting the remaining hosts for this loop 24971 1727096433.12785: getting the next task for host managed_node3 24971 1727096433.12790: done getting next task for host managed_node3 24971 1727096433.12793: ^ task is: TASK: Ensure ping6 command is present 24971 1727096433.12795: ^ state is: HOST STATE: block=3, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24971 1727096433.12798: getting variables 24971 1727096433.12800: in VariableManager get_vars() 24971 1727096433.12841: Calling all_inventory to load vars for managed_node3 24971 1727096433.12843: Calling groups_inventory to load vars for managed_node3 24971 1727096433.12846: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096433.12857: Calling all_plugins_play to load vars for managed_node3 24971 1727096433.12860: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096433.12863: Calling groups_plugins_play to load vars for managed_node3 24971 1727096433.14363: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096433.16347: done with get_vars() 24971 1727096433.16372: done getting variables 24971 1727096433.16438: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Ensure ping6 command is present] ***************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:81 Monday 23 September 2024 09:00:33 -0400 (0:00:00.069) 0:00:20.642 ****** 24971 1727096433.16466: entering _queue_task() for managed_node3/package 24971 1727096433.16964: worker is 1 (out of 1 available) 24971 1727096433.16979: exiting _queue_task() for managed_node3/package 24971 1727096433.16990: done queuing things up, now waiting for results queue to drain 24971 1727096433.16991: waiting for pending results... 24971 1727096433.17785: running TaskExecutor() for managed_node3/TASK: Ensure ping6 command is present 24971 1727096433.17790: in run() - task 0afff68d-5257-3482-6844-000000000064 24971 1727096433.17794: variable 'ansible_search_path' from source: unknown 24971 1727096433.17797: calling self._execute() 24971 1727096433.18177: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096433.18181: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096433.18185: variable 'omit' from source: magic vars 24971 1727096433.18592: variable 'ansible_distribution_major_version' from source: facts 24971 1727096433.18610: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096433.18621: variable 'omit' from source: magic vars 24971 1727096433.18648: variable 'omit' from source: magic vars 24971 1727096433.18848: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24971 1727096433.20875: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24971 1727096433.20947: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24971 1727096433.20996: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24971 1727096433.21037: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24971 1727096433.21074: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24971 1727096433.21166: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24971 1727096433.21204: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24971 1727096433.21232: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24971 1727096433.21284: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24971 1727096433.21305: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24971 1727096433.21410: variable '__network_is_ostree' from source: set_fact 24971 1727096433.21420: variable 'omit' from source: magic vars 24971 1727096433.21456: variable 'omit' from source: magic vars 24971 1727096433.21563: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24971 1727096433.21566: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24971 1727096433.21573: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24971 1727096433.21575: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096433.21578: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096433.21584: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24971 1727096433.21592: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096433.21599: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096433.21701: Set connection var ansible_shell_type to sh 24971 1727096433.21713: Set connection var ansible_shell_executable to /bin/sh 24971 1727096433.21727: Set connection var ansible_timeout to 10 24971 1727096433.21735: Set connection var ansible_connection to ssh 24971 1727096433.21744: Set connection var ansible_pipelining to False 24971 1727096433.21751: Set connection var ansible_module_compression to ZIP_DEFLATED 24971 1727096433.21783: variable 'ansible_shell_executable' from source: unknown 24971 1727096433.21791: variable 'ansible_connection' from source: unknown 24971 1727096433.21796: variable 'ansible_module_compression' from source: unknown 24971 1727096433.21801: variable 'ansible_shell_type' from source: unknown 24971 1727096433.21806: variable 'ansible_shell_executable' from source: unknown 24971 1727096433.21811: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096433.21816: variable 'ansible_pipelining' from source: unknown 24971 1727096433.21821: variable 'ansible_timeout' from source: unknown 24971 1727096433.21827: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096433.21919: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24971 1727096433.21975: variable 'omit' from source: magic vars 24971 1727096433.21978: starting attempt loop 24971 1727096433.21981: running the handler 24971 1727096433.21983: variable 'ansible_facts' from source: unknown 24971 1727096433.21985: variable 'ansible_facts' from source: unknown 24971 1727096433.22000: _low_level_execute_command(): starting 24971 1727096433.22011: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24971 1727096433.22665: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096433.22738: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096433.22741: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096433.22743: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096433.22805: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096433.24458: stdout chunk (state=3): >>>/root <<< 24971 1727096433.24609: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096433.24613: stdout chunk (state=3): >>><<< 24971 1727096433.24615: stderr chunk (state=3): >>><<< 24971 1727096433.24632: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from 
master 0 24971 1727096433.24742: _low_level_execute_command(): starting 24971 1727096433.24746: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096433.2464557-25867-142825738123545 `" && echo ansible-tmp-1727096433.2464557-25867-142825738123545="` echo /root/.ansible/tmp/ansible-tmp-1727096433.2464557-25867-142825738123545 `" ) && sleep 0' 24971 1727096433.25274: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24971 1727096433.25290: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096433.25309: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096433.25325: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24971 1727096433.25344: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 24971 1727096433.25385: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096433.25469: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096433.25488: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096433.25520: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096433.25578: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096433.27488: stdout chunk (state=3): >>>ansible-tmp-1727096433.2464557-25867-142825738123545=/root/.ansible/tmp/ansible-tmp-1727096433.2464557-25867-142825738123545 <<< 24971 1727096433.27768: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096433.27772: stdout chunk (state=3): >>><<< 24971 1727096433.27774: stderr chunk (state=3): >>><<< 24971 1727096433.27801: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096433.2464557-25867-142825738123545=/root/.ansible/tmp/ansible-tmp-1727096433.2464557-25867-142825738123545 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096433.27815: variable 'ansible_module_compression' from source: unknown 24971 1727096433.27882: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24971ctsu8jan/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 24971 1727096433.27934: variable 'ansible_facts' from source: unknown 24971 1727096433.28128: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096433.2464557-25867-142825738123545/AnsiballZ_dnf.py 24971 1727096433.28259: Sending initial data 24971 1727096433.28262: Sent initial data (152 bytes) 24971 1727096433.28791: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24971 1727096433.28805: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096433.28899: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096433.28924: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096433.30463: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24971 1727096433.30497: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 24971 1727096433.30571: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24971ctsu8jan/tmp_3dm23er /root/.ansible/tmp/ansible-tmp-1727096433.2464557-25867-142825738123545/AnsiballZ_dnf.py <<< 24971 1727096433.30574: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096433.2464557-25867-142825738123545/AnsiballZ_dnf.py" <<< 24971 1727096433.30617: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24971ctsu8jan/tmp_3dm23er" to remote "/root/.ansible/tmp/ansible-tmp-1727096433.2464557-25867-142825738123545/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096433.2464557-25867-142825738123545/AnsiballZ_dnf.py" <<< 24971 1727096433.31570: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096433.31692: stderr chunk (state=3): >>><<< 24971 1727096433.31695: stdout chunk (state=3): >>><<< 24971 1727096433.31713: done transferring module to remote 24971 1727096433.31729: _low_level_execute_command(): starting 24971 1727096433.31738: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096433.2464557-25867-142825738123545/ /root/.ansible/tmp/ansible-tmp-1727096433.2464557-25867-142825738123545/AnsiballZ_dnf.py && sleep 0' 24971 1727096433.32385: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24971 1727096433.32401: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096433.32460: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096433.32477: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096433.32537: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096433.32550: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096433.32586: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096433.32640: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096433.34431: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096433.34435: stdout chunk (state=3): >>><<< 24971 1727096433.34437: stderr chunk (state=3): >>><<< 24971 1727096433.34452: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 
10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096433.34473: _low_level_execute_command(): starting 24971 1727096433.34476: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096433.2464557-25867-142825738123545/AnsiballZ_dnf.py && sleep 0' 24971 1727096433.35013: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24971 1727096433.35026: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096433.35041: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096433.35059: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24971 1727096433.35084: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 24971 1727096433.35097: stderr chunk (state=3): >>>debug2: match not found <<< 24971 1727096433.35110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096433.35187: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096433.35209: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096433.35234: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096433.35249: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096433.35486: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096433.76608: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iputils"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": 
true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 24971 1727096433.80982: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. <<< 24971 1727096433.80986: stdout chunk (state=3): >>><<< 24971 1727096433.80988: stderr chunk (state=3): >>><<< 24971 1727096433.80991: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iputils"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
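Annotation: the dnf call above ("name": ["iputils"], "state": "present", result "Nothing to do") belongs to the "Ensure ping6 command is present" task at tests_ipv6.yml:81. A minimal sketch of that task, assuming it uses the generic package module (which the package action-plugin load above suggests):

# Sketch based on the module arguments visible in this log; the real task
# may carry additional options.
- name: Ensure ping6 command is present
  package:
    name: iputils
    state: present

On this host the package action resolves to ansible.legacy.dnf, which is why an AnsiballZ_dnf.py payload is transferred and executed in the preceding lines.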
24971 1727096433.80997: done with _execute_module (ansible.legacy.dnf, {'name': 'iputils', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096433.2464557-25867-142825738123545/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24971 1727096433.81000: _low_level_execute_command(): starting 24971 1727096433.81002: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096433.2464557-25867-142825738123545/ > /dev/null 2>&1 && sleep 0' 24971 1727096433.81814: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096433.81840: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096433.81898: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096433.83788: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096433.83791: stdout chunk (state=3): >>><<< 24971 1727096433.83794: stderr chunk (state=3): >>><<< 24971 1727096433.83811: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096433.83834: handler run complete 24971 1727096433.83975: attempt loop complete, returning result 24971 1727096433.83978: _execute() done 24971 1727096433.83980: dumping result to json 24971 1727096433.83982: done dumping result, returning 24971 1727096433.83984: done running TaskExecutor() for managed_node3/TASK: Ensure ping6 command is present [0afff68d-5257-3482-6844-000000000064] 24971 1727096433.83986: sending task result for task 0afff68d-5257-3482-6844-000000000064 24971 1727096433.84051: done sending task result for task 0afff68d-5257-3482-6844-000000000064 24971 1727096433.84054: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 24971 1727096433.84131: no more pending results, returning what we have 24971 1727096433.84134: results queue empty 24971 1727096433.84136: checking for any_errors_fatal 24971 1727096433.84142: done checking for any_errors_fatal 24971 1727096433.84143: checking for max_fail_percentage 24971 1727096433.84145: done checking for max_fail_percentage 24971 1727096433.84146: checking to see if all hosts have failed and the running result is not ok 24971 1727096433.84147: done checking to see if all hosts have failed 24971 1727096433.84147: getting the remaining hosts for this loop 24971 1727096433.84149: done getting the remaining hosts for this loop 24971 1727096433.84153: getting the next task for host managed_node3 24971 1727096433.84160: done getting next task for host managed_node3 24971 1727096433.84162: ^ task is: TASK: Test gateway can be pinged 24971 1727096433.84164: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24971 1727096433.84284: getting variables 24971 1727096433.84287: in VariableManager get_vars() 24971 1727096433.84329: Calling all_inventory to load vars for managed_node3 24971 1727096433.84332: Calling groups_inventory to load vars for managed_node3 24971 1727096433.84334: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096433.84345: Calling all_plugins_play to load vars for managed_node3 24971 1727096433.84348: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096433.84351: Calling groups_plugins_play to load vars for managed_node3 24971 1727096433.85776: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096433.86624: done with get_vars() 24971 1727096433.86641: done getting variables 24971 1727096433.86689: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Test gateway can be pinged] ********************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:86 Monday 23 September 2024 09:00:33 -0400 (0:00:00.702) 0:00:21.344 ****** 24971 1727096433.86711: entering _queue_task() for managed_node3/command 24971 1727096433.86957: worker is 1 (out of 1 available) 24971 1727096433.86973: exiting _queue_task() for managed_node3/command 24971 1727096433.86984: done queuing things up, now waiting for results queue to drain 24971 1727096433.86985: waiting for pending results... 24971 1727096433.87288: running TaskExecutor() for managed_node3/TASK: Test gateway can be pinged 24971 1727096433.87293: in run() - task 0afff68d-5257-3482-6844-000000000065 24971 1727096433.87362: variable 'ansible_search_path' from source: unknown 24971 1727096433.87366: calling self._execute() 24971 1727096433.87475: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096433.87479: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096433.87483: variable 'omit' from source: magic vars 24971 1727096433.87838: variable 'ansible_distribution_major_version' from source: facts 24971 1727096433.87856: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096433.87871: variable 'omit' from source: magic vars 24971 1727096433.88075: variable 'omit' from source: magic vars 24971 1727096433.88079: variable 'omit' from source: magic vars 24971 1727096433.88081: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24971 1727096433.88083: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24971 1727096433.88085: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24971 1727096433.88087: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096433.88089: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096433.88104: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24971 1727096433.88112: variable 'ansible_host' from source: host vars for 
'managed_node3' 24971 1727096433.88119: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096433.88231: Set connection var ansible_shell_type to sh 24971 1727096433.88276: Set connection var ansible_shell_executable to /bin/sh 24971 1727096433.88279: Set connection var ansible_timeout to 10 24971 1727096433.88282: Set connection var ansible_connection to ssh 24971 1727096433.88284: Set connection var ansible_pipelining to False 24971 1727096433.88286: Set connection var ansible_module_compression to ZIP_DEFLATED 24971 1727096433.88294: variable 'ansible_shell_executable' from source: unknown 24971 1727096433.88296: variable 'ansible_connection' from source: unknown 24971 1727096433.88298: variable 'ansible_module_compression' from source: unknown 24971 1727096433.88301: variable 'ansible_shell_type' from source: unknown 24971 1727096433.88304: variable 'ansible_shell_executable' from source: unknown 24971 1727096433.88306: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096433.88311: variable 'ansible_pipelining' from source: unknown 24971 1727096433.88313: variable 'ansible_timeout' from source: unknown 24971 1727096433.88316: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096433.88428: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24971 1727096433.88436: variable 'omit' from source: magic vars 24971 1727096433.88439: starting attempt loop 24971 1727096433.88442: running the handler 24971 1727096433.88461: _low_level_execute_command(): starting 24971 1727096433.88469: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24971 1727096433.88930: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096433.88962: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096433.88966: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration <<< 24971 1727096433.88973: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 24971 1727096433.88976: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096433.89022: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096433.89031: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096433.89065: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096433.90714: stdout chunk (state=3): 
>>>/root <<< 24971 1727096433.90856: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096433.90860: stdout chunk (state=3): >>><<< 24971 1727096433.90875: stderr chunk (state=3): >>><<< 24971 1727096433.91008: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096433.91012: _low_level_execute_command(): starting 24971 1727096433.91016: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096433.9090703-25900-111728799208740 `" && echo ansible-tmp-1727096433.9090703-25900-111728799208740="` echo /root/.ansible/tmp/ansible-tmp-1727096433.9090703-25900-111728799208740 `" ) && sleep 0' 24971 1727096433.91632: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096433.91635: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 24971 1727096433.91638: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 24971 1727096433.91649: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24971 1727096433.91663: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096433.91695: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096433.91698: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096433.91744: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096433.93779: stdout chunk (state=3): 
>>>ansible-tmp-1727096433.9090703-25900-111728799208740=/root/.ansible/tmp/ansible-tmp-1727096433.9090703-25900-111728799208740 <<< 24971 1727096433.93783: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096433.93870: stderr chunk (state=3): >>><<< 24971 1727096433.93874: stdout chunk (state=3): >>><<< 24971 1727096433.93877: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096433.9090703-25900-111728799208740=/root/.ansible/tmp/ansible-tmp-1727096433.9090703-25900-111728799208740 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096433.93910: variable 'ansible_module_compression' from source: unknown 24971 1727096433.93990: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24971ctsu8jan/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 24971 1727096433.94016: variable 'ansible_facts' from source: unknown 24971 1727096433.94152: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096433.9090703-25900-111728799208740/AnsiballZ_command.py 24971 1727096433.94295: Sending initial data 24971 1727096433.94301: Sent initial data (156 bytes) 24971 1727096433.94816: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24971 1727096433.94825: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096433.94913: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096433.94917: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24971 1727096433.94919: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 24971 1727096433.94921: stderr chunk (state=3): >>>debug2: match not found <<< 24971 1727096433.94923: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096433.94953: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096433.94974: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096433.95024: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096433.96607: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24971 1727096433.96633: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 24971 1727096433.96676: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24971ctsu8jan/tmpvwknv_8j /root/.ansible/tmp/ansible-tmp-1727096433.9090703-25900-111728799208740/AnsiballZ_command.py <<< 24971 1727096433.96714: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096433.9090703-25900-111728799208740/AnsiballZ_command.py" <<< 24971 1727096433.96718: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24971ctsu8jan/tmpvwknv_8j" to remote "/root/.ansible/tmp/ansible-tmp-1727096433.9090703-25900-111728799208740/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096433.9090703-25900-111728799208740/AnsiballZ_command.py" <<< 24971 1727096433.97427: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096433.97480: stderr chunk (state=3): >>><<< 24971 1727096433.97484: stdout chunk (state=3): >>><<< 24971 1727096433.97494: done transferring module to remote 24971 1727096433.97513: _low_level_execute_command(): starting 24971 1727096433.97549: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096433.9090703-25900-111728799208740/ /root/.ansible/tmp/ansible-tmp-1727096433.9090703-25900-111728799208740/AnsiballZ_command.py && sleep 0' 24971 1727096433.98698: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096433.98702: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24971 1727096433.98752: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 24971 1727096433.98755: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address <<< 24971 1727096433.98758: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096433.98763: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096433.98816: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096433.98821: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096433.98848: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096433.98911: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096434.00889: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096434.00893: stdout chunk (state=3): >>><<< 24971 1727096434.00895: stderr chunk (state=3): >>><<< 24971 1727096434.00898: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096434.00900: _low_level_execute_command(): starting 24971 1727096434.00902: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096433.9090703-25900-111728799208740/AnsiballZ_command.py && sleep 0' 24971 1727096434.01504: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24971 1727096434.01536: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096434.01559: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096434.01583: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24971 1727096434.01605: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 24971 1727096434.01643: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096434.01754: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096434.01779: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096434.01880: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096434.17891: stdout chunk (state=3): >>> {"changed": true, "stdout": "PING 2001:db8::1 (2001:db8::1) 56 data bytes\n64 bytes from 2001:db8::1: icmp_seq=1 ttl=64 time=0.050 ms\n\n--- 2001:db8::1 ping statistics ---\n1 packets transmitted, 1 received, 0% packet loss, time 0ms\nrtt min/avg/max/mdev = 0.050/0.050/0.050/0.000 ms", "stderr": "", "rc": 0, "cmd": ["ping6", "-c1", "2001:db8::1"], "start": "2024-09-23 09:00:34.168371", "end": "2024-09-23 09:00:34.175250", "delta": "0:00:00.006879", "msg": "", "invocation": {"module_args": {"_raw_params": "ping6 -c1 2001:db8::1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 24971 1727096434.19331: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. <<< 24971 1727096434.19356: stderr chunk (state=3): >>><<< 24971 1727096434.19360: stdout chunk (state=3): >>><<< 24971 1727096434.19385: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "PING 2001:db8::1 (2001:db8::1) 56 data bytes\n64 bytes from 2001:db8::1: icmp_seq=1 ttl=64 time=0.050 ms\n\n--- 2001:db8::1 ping statistics ---\n1 packets transmitted, 1 received, 0% packet loss, time 0ms\nrtt min/avg/max/mdev = 0.050/0.050/0.050/0.000 ms", "stderr": "", "rc": 0, "cmd": ["ping6", "-c1", "2001:db8::1"], "start": "2024-09-23 09:00:34.168371", "end": "2024-09-23 09:00:34.175250", "delta": "0:00:00.006879", "msg": "", "invocation": {"module_args": {"_raw_params": "ping6 -c1 2001:db8::1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 24971 1727096434.19414: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ping6 -c1 2001:db8::1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096433.9090703-25900-111728799208740/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24971 1727096434.19422: _low_level_execute_command(): starting 24971 1727096434.19427: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096433.9090703-25900-111728799208740/ > /dev/null 2>&1 && sleep 0' 24971 1727096434.19857: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096434.19891: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24971 1727096434.19894: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 24971 1727096434.19896: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096434.19898: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096434.19900: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096434.19955: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096434.19962: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096434.19964: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096434.19995: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096434.21776: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096434.21802: stderr chunk (state=3): >>><<< 24971 1727096434.21805: stdout chunk (state=3): >>><<< 24971 1727096434.21817: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096434.21823: handler run complete 24971 1727096434.21848: Evaluated conditional (False): False 24971 1727096434.21857: attempt loop complete, returning result 24971 1727096434.21860: _execute() done 24971 1727096434.21862: dumping result to json 24971 1727096434.21871: done dumping result, returning 24971 1727096434.21877: done running TaskExecutor() for managed_node3/TASK: Test gateway can be pinged [0afff68d-5257-3482-6844-000000000065] 24971 1727096434.21881: sending task result for task 0afff68d-5257-3482-6844-000000000065 24971 1727096434.21977: done sending task result for task 0afff68d-5257-3482-6844-000000000065 24971 1727096434.21980: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "ping6", "-c1", "2001:db8::1" ], "delta": "0:00:00.006879", "end": "2024-09-23 09:00:34.175250", "rc": 0, "start": "2024-09-23 09:00:34.168371" } STDOUT: PING 2001:db8::1 (2001:db8::1) 56 data bytes 64 bytes from 2001:db8::1: icmp_seq=1 ttl=64 time=0.050 ms --- 2001:db8::1 ping statistics --- 1 packets transmitted, 1 received, 0% packet loss, time 0ms rtt min/avg/max/mdev = 0.050/0.050/0.050/0.000 ms 24971 1727096434.22046: no more pending results, returning what we have 24971 1727096434.22049: results queue empty 24971 1727096434.22050: checking for any_errors_fatal 24971 1727096434.22061: done checking for any_errors_fatal 24971 1727096434.22062: checking for max_fail_percentage 24971 1727096434.22064: done checking for max_fail_percentage 24971 1727096434.22065: checking to see if all hosts have failed and the running result is not ok 24971 1727096434.22066: done checking to see if all hosts have failed 24971 1727096434.22066: getting the remaining hosts for this loop 24971 1727096434.22072: done getting the remaining hosts for this loop 24971 1727096434.22075: getting the next task for host managed_node3 24971 1727096434.22082: done getting next task for host managed_node3 24971 1727096434.22085: ^ task is: TASK: TEARDOWN: remove profiles. 24971 1727096434.22087: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24971 1727096434.22091: getting variables 24971 1727096434.22093: in VariableManager get_vars() 24971 1727096434.22132: Calling all_inventory to load vars for managed_node3 24971 1727096434.22134: Calling groups_inventory to load vars for managed_node3 24971 1727096434.22136: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096434.22147: Calling all_plugins_play to load vars for managed_node3 24971 1727096434.22149: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096434.22152: Calling groups_plugins_play to load vars for managed_node3 24971 1727096434.23097: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096434.23936: done with get_vars() 24971 1727096434.23952: done getting variables 24971 1727096434.23998: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [TEARDOWN: remove profiles.] ********************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:92 Monday 23 September 2024 09:00:34 -0400 (0:00:00.373) 0:00:21.717 ****** 24971 1727096434.24019: entering _queue_task() for managed_node3/debug 24971 1727096434.24242: worker is 1 (out of 1 available) 24971 1727096434.24255: exiting _queue_task() for managed_node3/debug 24971 1727096434.24271: done queuing things up, now waiting for results queue to drain 24971 1727096434.24272: waiting for pending results... 24971 1727096434.24435: running TaskExecutor() for managed_node3/TASK: TEARDOWN: remove profiles. 
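The "Test gateway can be pinged" result rendered just above came from the command module running ping6 -c1 2001:db8::1 against the IPv6 test address 2001:db8::1; a single echo reply with 0% packet loss and rc=0 is what makes the task report ok. A hedged sketch of an equivalent task, reconstructed from the logged module_args rather than from tests_ipv6.yml itself:

    - name: Test gateway can be pinged
      ansible.builtin.command: ping6 -c1 2001:db8::1
      # changed_when: false   # assumption: the raw module JSON reports changed=true while the rendered
      #                       # result shows changed=false, and the "Evaluated conditional (False): False"
      #                       # record after the handler completes is consistent with a literal false condition
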
24971 1727096434.24500: in run() - task 0afff68d-5257-3482-6844-000000000066 24971 1727096434.24511: variable 'ansible_search_path' from source: unknown 24971 1727096434.24538: calling self._execute() 24971 1727096434.24618: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096434.24624: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096434.24627: variable 'omit' from source: magic vars 24971 1727096434.24892: variable 'ansible_distribution_major_version' from source: facts 24971 1727096434.24902: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096434.24908: variable 'omit' from source: magic vars 24971 1727096434.24924: variable 'omit' from source: magic vars 24971 1727096434.24954: variable 'omit' from source: magic vars 24971 1727096434.24986: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24971 1727096434.25012: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24971 1727096434.25028: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24971 1727096434.25042: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096434.25055: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096434.25081: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24971 1727096434.25084: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096434.25086: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096434.25150: Set connection var ansible_shell_type to sh 24971 1727096434.25162: Set connection var ansible_shell_executable to /bin/sh 24971 1727096434.25171: Set connection var ansible_timeout to 10 24971 1727096434.25174: Set connection var ansible_connection to ssh 24971 1727096434.25179: Set connection var ansible_pipelining to False 24971 1727096434.25184: Set connection var ansible_module_compression to ZIP_DEFLATED 24971 1727096434.25201: variable 'ansible_shell_executable' from source: unknown 24971 1727096434.25204: variable 'ansible_connection' from source: unknown 24971 1727096434.25206: variable 'ansible_module_compression' from source: unknown 24971 1727096434.25209: variable 'ansible_shell_type' from source: unknown 24971 1727096434.25211: variable 'ansible_shell_executable' from source: unknown 24971 1727096434.25213: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096434.25217: variable 'ansible_pipelining' from source: unknown 24971 1727096434.25219: variable 'ansible_timeout' from source: unknown 24971 1727096434.25223: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096434.25323: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24971 1727096434.25333: variable 'omit' from source: magic vars 24971 1727096434.25336: starting attempt loop 24971 1727096434.25339: running the handler 24971 1727096434.25381: handler run complete 24971 1727096434.25394: attempt loop complete, 
returning result 24971 1727096434.25397: _execute() done 24971 1727096434.25400: dumping result to json 24971 1727096434.25402: done dumping result, returning 24971 1727096434.25408: done running TaskExecutor() for managed_node3/TASK: TEARDOWN: remove profiles. [0afff68d-5257-3482-6844-000000000066] 24971 1727096434.25413: sending task result for task 0afff68d-5257-3482-6844-000000000066 24971 1727096434.25497: done sending task result for task 0afff68d-5257-3482-6844-000000000066 24971 1727096434.25500: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: ################################################## 24971 1727096434.25543: no more pending results, returning what we have 24971 1727096434.25546: results queue empty 24971 1727096434.25547: checking for any_errors_fatal 24971 1727096434.25556: done checking for any_errors_fatal 24971 1727096434.25557: checking for max_fail_percentage 24971 1727096434.25559: done checking for max_fail_percentage 24971 1727096434.25560: checking to see if all hosts have failed and the running result is not ok 24971 1727096434.25560: done checking to see if all hosts have failed 24971 1727096434.25561: getting the remaining hosts for this loop 24971 1727096434.25562: done getting the remaining hosts for this loop 24971 1727096434.25566: getting the next task for host managed_node3 24971 1727096434.25578: done getting next task for host managed_node3 24971 1727096434.25583: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 24971 1727096434.25586: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 24971 1727096434.25608: getting variables 24971 1727096434.25609: in VariableManager get_vars() 24971 1727096434.25642: Calling all_inventory to load vars for managed_node3 24971 1727096434.25645: Calling groups_inventory to load vars for managed_node3 24971 1727096434.25647: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096434.25655: Calling all_plugins_play to load vars for managed_node3 24971 1727096434.25658: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096434.25660: Calling groups_plugins_play to load vars for managed_node3 24971 1727096434.26435: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096434.27299: done with get_vars() 24971 1727096434.27314: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Monday 23 September 2024 09:00:34 -0400 (0:00:00.033) 0:00:21.751 ****** 24971 1727096434.27382: entering _queue_task() for managed_node3/include_tasks 24971 1727096434.27599: worker is 1 (out of 1 available) 24971 1727096434.27611: exiting _queue_task() for managed_node3/include_tasks 24971 1727096434.27622: done queuing things up, now waiting for results queue to drain 24971 1727096434.27623: waiting for pending results... 
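The teardown block opens with a plain debug banner: the rendered result above is ok: [managed_node3] => {} with a MSG consisting only of a row of '#' characters. Something along these lines would produce it (a sketch; the exact task text is not part of this log):

    - name: "TEARDOWN: remove profiles."
      ansible.builtin.debug:
        msg: "##################################################"

Immediately after the banner the role itself is entered again: the next records run fedora.linux_system_roles.network : Ensure ansible_facts used by role from roles/network/tasks/main.yml:4, an include_tasks that pulls in set_facts.yml for managed_node3.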
24971 1727096434.27836: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 24971 1727096434.27925: in run() - task 0afff68d-5257-3482-6844-00000000006e 24971 1727096434.27937: variable 'ansible_search_path' from source: unknown 24971 1727096434.27940: variable 'ansible_search_path' from source: unknown 24971 1727096434.27973: calling self._execute() 24971 1727096434.28043: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096434.28047: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096434.28056: variable 'omit' from source: magic vars 24971 1727096434.28323: variable 'ansible_distribution_major_version' from source: facts 24971 1727096434.28330: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096434.28334: _execute() done 24971 1727096434.28337: dumping result to json 24971 1727096434.28342: done dumping result, returning 24971 1727096434.28349: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0afff68d-5257-3482-6844-00000000006e] 24971 1727096434.28353: sending task result for task 0afff68d-5257-3482-6844-00000000006e 24971 1727096434.28435: done sending task result for task 0afff68d-5257-3482-6844-00000000006e 24971 1727096434.28437: WORKER PROCESS EXITING 24971 1727096434.28485: no more pending results, returning what we have 24971 1727096434.28490: in VariableManager get_vars() 24971 1727096434.28531: Calling all_inventory to load vars for managed_node3 24971 1727096434.28533: Calling groups_inventory to load vars for managed_node3 24971 1727096434.28535: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096434.28544: Calling all_plugins_play to load vars for managed_node3 24971 1727096434.28546: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096434.28548: Calling groups_plugins_play to load vars for managed_node3 24971 1727096434.29426: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096434.30935: done with get_vars() 24971 1727096434.30952: variable 'ansible_search_path' from source: unknown 24971 1727096434.30953: variable 'ansible_search_path' from source: unknown 24971 1727096434.30990: we have included files to process 24971 1727096434.30991: generating all_blocks data 24971 1727096434.30993: done generating all_blocks data 24971 1727096434.30996: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 24971 1727096434.30997: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 24971 1727096434.30998: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 24971 1727096434.31372: done processing included file 24971 1727096434.31374: iterating over new_blocks loaded from include file 24971 1727096434.31375: in VariableManager get_vars() 24971 1727096434.31392: done with get_vars() 24971 1727096434.31394: filtering new block on tags 24971 1727096434.31406: done filtering new block on tags 24971 1727096434.31409: in VariableManager get_vars() 24971 1727096434.31422: done with get_vars() 24971 1727096434.31423: filtering new block on tags 24971 1727096434.31436: done filtering new block on tags 24971 1727096434.31437: in 
VariableManager get_vars() 24971 1727096434.31452: done with get_vars() 24971 1727096434.31453: filtering new block on tags 24971 1727096434.31464: done filtering new block on tags 24971 1727096434.31465: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node3 24971 1727096434.31470: extending task lists for all hosts with included blocks 24971 1727096434.31930: done extending task lists 24971 1727096434.31932: done processing included files 24971 1727096434.31932: results queue empty 24971 1727096434.31932: checking for any_errors_fatal 24971 1727096434.31935: done checking for any_errors_fatal 24971 1727096434.31935: checking for max_fail_percentage 24971 1727096434.31936: done checking for max_fail_percentage 24971 1727096434.31936: checking to see if all hosts have failed and the running result is not ok 24971 1727096434.31937: done checking to see if all hosts have failed 24971 1727096434.31937: getting the remaining hosts for this loop 24971 1727096434.31938: done getting the remaining hosts for this loop 24971 1727096434.31940: getting the next task for host managed_node3 24971 1727096434.31943: done getting next task for host managed_node3 24971 1727096434.31945: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 24971 1727096434.31947: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 24971 1727096434.31955: getting variables 24971 1727096434.31955: in VariableManager get_vars() 24971 1727096434.31966: Calling all_inventory to load vars for managed_node3 24971 1727096434.31969: Calling groups_inventory to load vars for managed_node3 24971 1727096434.31971: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096434.31975: Calling all_plugins_play to load vars for managed_node3 24971 1727096434.31976: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096434.31978: Calling groups_plugins_play to load vars for managed_node3 24971 1727096434.32616: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096434.34707: done with get_vars() 24971 1727096434.34725: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Monday 23 September 2024 09:00:34 -0400 (0:00:00.074) 0:00:21.825 ****** 24971 1727096434.34802: entering _queue_task() for managed_node3/setup 24971 1727096434.35123: worker is 1 (out of 1 available) 24971 1727096434.35135: exiting _queue_task() for managed_node3/setup 24971 1727096434.35147: done queuing things up, now waiting for results queue to drain 24971 1727096434.35148: waiting for pending results... 24971 1727096434.35558: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 24971 1727096434.35602: in run() - task 0afff68d-5257-3482-6844-000000000513 24971 1727096434.35621: variable 'ansible_search_path' from source: unknown 24971 1727096434.35628: variable 'ansible_search_path' from source: unknown 24971 1727096434.35685: calling self._execute() 24971 1727096434.35787: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096434.35797: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096434.35810: variable 'omit' from source: magic vars 24971 1727096434.36187: variable 'ansible_distribution_major_version' from source: facts 24971 1727096434.36212: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096434.36430: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24971 1727096434.38754: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24971 1727096434.38839: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24971 1727096434.38882: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24971 1727096434.38917: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24971 1727096434.38951: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24971 1727096434.39036: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24971 1727096434.39145: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 24971 1727096434.39148: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24971 1727096434.39150: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24971 1727096434.39156: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24971 1727096434.39210: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24971 1727096434.39235: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24971 1727096434.39267: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24971 1727096434.39310: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24971 1727096434.39325: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24971 1727096434.39490: variable '__network_required_facts' from source: role '' defaults 24971 1727096434.39504: variable 'ansible_facts' from source: unknown 24971 1727096434.40342: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 24971 1727096434.40355: when evaluation is False, skipping this task 24971 1727096434.40363: _execute() done 24971 1727096434.40444: dumping result to json 24971 1727096434.40448: done dumping result, returning 24971 1727096434.40452: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0afff68d-5257-3482-6844-000000000513] 24971 1727096434.40455: sending task result for task 0afff68d-5257-3482-6844-000000000513 24971 1727096434.40522: done sending task result for task 0afff68d-5257-3482-6844-000000000513 24971 1727096434.40527: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 24971 1727096434.40578: no more pending results, returning what we have 24971 1727096434.40581: results queue empty 24971 1727096434.40582: checking for any_errors_fatal 24971 1727096434.40584: done checking for any_errors_fatal 24971 1727096434.40585: checking for max_fail_percentage 24971 1727096434.40587: done checking for max_fail_percentage 24971 1727096434.40588: checking to see if all hosts have failed and the running result is not ok 24971 1727096434.40588: done checking to see if all hosts have failed 24971 1727096434.40589: getting the remaining hosts for this loop 24971 1727096434.40590: done getting the remaining hosts for 
this loop 24971 1727096434.40594: getting the next task for host managed_node3 24971 1727096434.40603: done getting next task for host managed_node3 24971 1727096434.40606: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 24971 1727096434.40610: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 24971 1727096434.40629: getting variables 24971 1727096434.40632: in VariableManager get_vars() 24971 1727096434.40678: Calling all_inventory to load vars for managed_node3 24971 1727096434.40681: Calling groups_inventory to load vars for managed_node3 24971 1727096434.40684: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096434.40694: Calling all_plugins_play to load vars for managed_node3 24971 1727096434.40697: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096434.40700: Calling groups_plugins_play to load vars for managed_node3 24971 1727096434.42598: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096434.44249: done with get_vars() 24971 1727096434.44277: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Monday 23 September 2024 09:00:34 -0400 (0:00:00.095) 0:00:21.921 ****** 24971 1727096434.44375: entering _queue_task() for managed_node3/stat 24971 1727096434.44648: worker is 1 (out of 1 available) 24971 1727096434.44658: exiting _queue_task() for managed_node3/stat 24971 1727096434.44819: done queuing things up, now waiting for results queue to drain 24971 1727096434.44821: waiting for pending results... 
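The skip recorded above for "Ensure ansible_facts used by role are present" comes from the conditional `__network_required_facts | difference(ansible_facts.keys() | list) | length > 0`: the fact-gathering step only runs when some required fact is still missing, and the result body is censored because the task sets `no_log: true`. A minimal Python sketch of the equivalent check, with hypothetical fact names that are not taken from this log:

```python
# Illustrative sketch only: mirrors the Jinja2 conditional
#   __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
# in plain Python. The fact names below are assumptions for the example.
required_facts = ["distribution", "distribution_major_version", "os_family"]
ansible_facts = {
    "distribution": "CentOS",
    "distribution_major_version": "10",
    "os_family": "RedHat",
}

missing = [fact for fact in required_facts if fact not in ansible_facts]
run_setup = len(missing) > 0
print(missing, run_setup)  # [] False -> the gathering task is skipped, as above
```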
24971 1727096434.45156: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 24971 1727096434.45161: in run() - task 0afff68d-5257-3482-6844-000000000515 24971 1727096434.45163: variable 'ansible_search_path' from source: unknown 24971 1727096434.45165: variable 'ansible_search_path' from source: unknown 24971 1727096434.45180: calling self._execute() 24971 1727096434.45280: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096434.45294: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096434.45310: variable 'omit' from source: magic vars 24971 1727096434.45690: variable 'ansible_distribution_major_version' from source: facts 24971 1727096434.45709: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096434.45865: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24971 1727096434.46239: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24971 1727096434.46242: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24971 1727096434.46285: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24971 1727096434.46324: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24971 1727096434.46447: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 24971 1727096434.46506: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 24971 1727096434.46539: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 24971 1727096434.46581: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 24971 1727096434.46681: variable '__network_is_ostree' from source: set_fact 24971 1727096434.46693: Evaluated conditional (not __network_is_ostree is defined): False 24971 1727096434.46706: when evaluation is False, skipping this task 24971 1727096434.46718: _execute() done 24971 1727096434.46784: dumping result to json 24971 1727096434.46788: done dumping result, returning 24971 1727096434.46791: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [0afff68d-5257-3482-6844-000000000515] 24971 1727096434.46793: sending task result for task 0afff68d-5257-3482-6844-000000000515 24971 1727096434.46976: done sending task result for task 0afff68d-5257-3482-6844-000000000515 24971 1727096434.46979: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 24971 1727096434.47036: no more pending results, returning what we have 24971 1727096434.47048: results queue empty 24971 1727096434.47049: checking for any_errors_fatal 24971 1727096434.47058: done checking for any_errors_fatal 24971 1727096434.47059: checking for 
max_fail_percentage 24971 1727096434.47061: done checking for max_fail_percentage 24971 1727096434.47062: checking to see if all hosts have failed and the running result is not ok 24971 1727096434.47063: done checking to see if all hosts have failed 24971 1727096434.47064: getting the remaining hosts for this loop 24971 1727096434.47066: done getting the remaining hosts for this loop 24971 1727096434.47074: getting the next task for host managed_node3 24971 1727096434.47081: done getting next task for host managed_node3 24971 1727096434.47084: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 24971 1727096434.47088: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 24971 1727096434.47274: getting variables 24971 1727096434.47276: in VariableManager get_vars() 24971 1727096434.47311: Calling all_inventory to load vars for managed_node3 24971 1727096434.47314: Calling groups_inventory to load vars for managed_node3 24971 1727096434.47316: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096434.47325: Calling all_plugins_play to load vars for managed_node3 24971 1727096434.47328: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096434.47331: Calling groups_plugins_play to load vars for managed_node3 24971 1727096434.48804: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096434.50537: done with get_vars() 24971 1727096434.50557: done getting variables 24971 1727096434.50619: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Monday 23 September 2024 09:00:34 -0400 (0:00:00.062) 0:00:21.984 ****** 24971 1727096434.50655: entering _queue_task() for managed_node3/set_fact 24971 1727096434.51039: worker is 1 (out of 1 available) 24971 1727096434.51049: exiting _queue_task() for managed_node3/set_fact 24971 1727096434.51060: done queuing things up, now waiting for results queue to drain 24971 1727096434.51060: waiting for pending results... 
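Both ostree-related tasks in this stretch of the log ("Check if system is ostree" above and "Set flag to indicate system is ostree" below) are guarded by `not __network_is_ostree is defined`, so they skip once the fact already exists from an earlier pass through set_facts.yml. When the fact is not yet defined, the check is a stat-style probe of a marker file; a hedged Python sketch of that kind of detection (the marker path is an assumption, not shown anywhere in this output):

```python
import os

# Hypothetical sketch of an OSTree detection check. This log only shows that a
# stat-based task would run when __network_is_ostree is undefined; the marker
# path below is a common convention and is assumed, not taken from this output.
OSTREE_MARKER = "/run/ostree-booted"

def is_ostree() -> bool:
    return os.path.exists(OSTREE_MARKER)

print({"__network_is_ostree": is_ostree()})
```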
24971 1727096434.51242: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 24971 1727096434.51411: in run() - task 0afff68d-5257-3482-6844-000000000516 24971 1727096434.51431: variable 'ansible_search_path' from source: unknown 24971 1727096434.51449: variable 'ansible_search_path' from source: unknown 24971 1727096434.51484: calling self._execute() 24971 1727096434.51612: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096434.51615: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096434.51618: variable 'omit' from source: magic vars 24971 1727096434.51993: variable 'ansible_distribution_major_version' from source: facts 24971 1727096434.52046: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096434.52187: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24971 1727096434.52460: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24971 1727096434.52515: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24971 1727096434.52558: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24971 1727096434.52604: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24971 1727096434.52698: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 24971 1727096434.52754: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 24971 1727096434.52758: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 24971 1727096434.52793: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 24971 1727096434.52890: variable '__network_is_ostree' from source: set_fact 24971 1727096434.52901: Evaluated conditional (not __network_is_ostree is defined): False 24971 1727096434.52976: when evaluation is False, skipping this task 24971 1727096434.52979: _execute() done 24971 1727096434.52982: dumping result to json 24971 1727096434.52984: done dumping result, returning 24971 1727096434.52987: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0afff68d-5257-3482-6844-000000000516] 24971 1727096434.52989: sending task result for task 0afff68d-5257-3482-6844-000000000516 24971 1727096434.53053: done sending task result for task 0afff68d-5257-3482-6844-000000000516 24971 1727096434.53056: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 24971 1727096434.53221: no more pending results, returning what we have 24971 1727096434.53224: results queue empty 24971 1727096434.53225: checking for any_errors_fatal 24971 1727096434.53232: done checking for any_errors_fatal 24971 
1727096434.53232: checking for max_fail_percentage 24971 1727096434.53234: done checking for max_fail_percentage 24971 1727096434.53235: checking to see if all hosts have failed and the running result is not ok 24971 1727096434.53236: done checking to see if all hosts have failed 24971 1727096434.53237: getting the remaining hosts for this loop 24971 1727096434.53238: done getting the remaining hosts for this loop 24971 1727096434.53241: getting the next task for host managed_node3 24971 1727096434.53249: done getting next task for host managed_node3 24971 1727096434.53253: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 24971 1727096434.53256: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 24971 1727096434.53278: getting variables 24971 1727096434.53280: in VariableManager get_vars() 24971 1727096434.53394: Calling all_inventory to load vars for managed_node3 24971 1727096434.53397: Calling groups_inventory to load vars for managed_node3 24971 1727096434.53399: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096434.53414: Calling all_plugins_play to load vars for managed_node3 24971 1727096434.53417: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096434.53421: Calling groups_plugins_play to load vars for managed_node3 24971 1727096434.54947: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096434.56536: done with get_vars() 24971 1727096434.56558: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Monday 23 September 2024 09:00:34 -0400 (0:00:00.060) 0:00:22.044 ****** 24971 1727096434.56658: entering _queue_task() for managed_node3/service_facts 24971 1727096434.57060: worker is 1 (out of 1 available) 24971 1727096434.57077: exiting _queue_task() for managed_node3/service_facts 24971 1727096434.57090: done queuing things up, now waiting for results queue to drain 24971 1727096434.57091: waiting for pending results... 
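The service_facts task queued above produces an `ansible_facts.services` dictionary keyed by systemd unit name; its full JSON payload appears further down in this output. A short, illustrative Python sketch of filtering such a result for running units, using a trimmed sample in the same shape — the filtering itself is not part of the role:

```python
# Trimmed sample in the same shape as the ansible_facts.services payload shown
# later in this log; the filtering below is illustrative, not role code.
services = {
    "NetworkManager.service": {"name": "NetworkManager.service", "state": "running",
                               "status": "enabled", "source": "systemd"},
    "firewalld.service": {"name": "firewalld.service", "state": "inactive",
                          "status": "disabled", "source": "systemd"},
    "auditd.service": {"name": "auditd.service", "state": "running",
                       "status": "enabled", "source": "systemd"},
}

running = sorted(name for name, svc in services.items() if svc["state"] == "running")
print(running)  # ['NetworkManager.service', 'auditd.service']
```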
24971 1727096434.57425: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running 24971 1727096434.57607: in run() - task 0afff68d-5257-3482-6844-000000000518 24971 1727096434.57610: variable 'ansible_search_path' from source: unknown 24971 1727096434.57614: variable 'ansible_search_path' from source: unknown 24971 1727096434.57798: calling self._execute() 24971 1727096434.57834: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096434.57847: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096434.57863: variable 'omit' from source: magic vars 24971 1727096434.58250: variable 'ansible_distribution_major_version' from source: facts 24971 1727096434.58275: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096434.58287: variable 'omit' from source: magic vars 24971 1727096434.58366: variable 'omit' from source: magic vars 24971 1727096434.58410: variable 'omit' from source: magic vars 24971 1727096434.58457: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24971 1727096434.58567: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24971 1727096434.58574: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24971 1727096434.58576: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096434.58580: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096434.58597: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24971 1727096434.58605: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096434.58612: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096434.58719: Set connection var ansible_shell_type to sh 24971 1727096434.58737: Set connection var ansible_shell_executable to /bin/sh 24971 1727096434.58748: Set connection var ansible_timeout to 10 24971 1727096434.58751: Set connection var ansible_connection to ssh 24971 1727096434.58784: Set connection var ansible_pipelining to False 24971 1727096434.58787: Set connection var ansible_module_compression to ZIP_DEFLATED 24971 1727096434.58790: variable 'ansible_shell_executable' from source: unknown 24971 1727096434.58792: variable 'ansible_connection' from source: unknown 24971 1727096434.58794: variable 'ansible_module_compression' from source: unknown 24971 1727096434.58801: variable 'ansible_shell_type' from source: unknown 24971 1727096434.58803: variable 'ansible_shell_executable' from source: unknown 24971 1727096434.58806: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096434.58808: variable 'ansible_pipelining' from source: unknown 24971 1727096434.58810: variable 'ansible_timeout' from source: unknown 24971 1727096434.58812: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096434.59001: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 24971 1727096434.59006: variable 'omit' from source: magic vars 24971 
1727096434.59009: starting attempt loop 24971 1727096434.59012: running the handler 24971 1727096434.59174: _low_level_execute_command(): starting 24971 1727096434.59178: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24971 1727096434.59782: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096434.59801: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096434.59815: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096434.59887: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096434.61539: stdout chunk (state=3): >>>/root <<< 24971 1727096434.61703: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096434.61707: stdout chunk (state=3): >>><<< 24971 1727096434.61713: stderr chunk (state=3): >>><<< 24971 1727096434.61829: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096434.61833: _low_level_execute_command(): starting 24971 1727096434.61836: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096434.6173882-25930-46544389924204 `" && echo ansible-tmp-1727096434.6173882-25930-46544389924204="` echo /root/.ansible/tmp/ansible-tmp-1727096434.6173882-25930-46544389924204 `" ) && sleep 
0' 24971 1727096434.62431: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24971 1727096434.62447: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096434.62462: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096434.62493: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24971 1727096434.62522: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 24971 1727096434.62536: stderr chunk (state=3): >>>debug2: match not found <<< 24971 1727096434.62550: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096434.62584: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24971 1727096434.62641: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096434.62700: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096434.62723: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096434.62794: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096434.64693: stdout chunk (state=3): >>>ansible-tmp-1727096434.6173882-25930-46544389924204=/root/.ansible/tmp/ansible-tmp-1727096434.6173882-25930-46544389924204 <<< 24971 1727096434.64841: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096434.64844: stdout chunk (state=3): >>><<< 24971 1727096434.64847: stderr chunk (state=3): >>><<< 24971 1727096434.64861: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096434.6173882-25930-46544389924204=/root/.ansible/tmp/ansible-tmp-1727096434.6173882-25930-46544389924204 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 
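The two _low_level_execute_command() calls above first probe the remote home directory (`echo ~ && sleep 0`) and then create a per-task temporary directory under a restrictive umask before any module is transferred. A minimal Python sketch of issuing the same kind of command with a plain `ssh` invocation; the host address is the one in this log, while the directory name is a placeholder:

```python
import subprocess
import time

# Illustrative only: ansible-core runs this through its ssh connection plugin
# with ControlMaster multiplexing; here an equivalent command is sent with a
# plain ssh call. The temp-dir name is a placeholder, not the one in the log.
host = "root@10.31.14.152"  # address taken from the log above
remote_tmp = f"/root/.ansible/tmp/ansible-tmp-{time.time():.0f}-example"

command = (
    f'( umask 77 && mkdir -p "$(dirname {remote_tmp})" '
    f'&& mkdir "{remote_tmp}" && echo {remote_tmp} ) && sleep 0'
)
result = subprocess.run(["ssh", host, command], capture_output=True, text=True)
print(result.returncode, result.stdout.strip())
```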
24971 1727096434.65075: variable 'ansible_module_compression' from source: unknown 24971 1727096434.65078: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24971ctsu8jan/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 24971 1727096434.65081: variable 'ansible_facts' from source: unknown 24971 1727096434.65101: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096434.6173882-25930-46544389924204/AnsiballZ_service_facts.py 24971 1727096434.65319: Sending initial data 24971 1727096434.65328: Sent initial data (161 bytes) 24971 1727096434.65836: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24971 1727096434.65853: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096434.65875: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096434.65980: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096434.66002: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096434.66066: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096434.67615: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 24971 1727096434.67635: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24971 1727096434.67682: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 24971 1727096434.67745: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24971ctsu8jan/tmpbugxc46c /root/.ansible/tmp/ansible-tmp-1727096434.6173882-25930-46544389924204/AnsiballZ_service_facts.py <<< 24971 1727096434.67756: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096434.6173882-25930-46544389924204/AnsiballZ_service_facts.py" <<< 24971 1727096434.67795: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24971ctsu8jan/tmpbugxc46c" to remote "/root/.ansible/tmp/ansible-tmp-1727096434.6173882-25930-46544389924204/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096434.6173882-25930-46544389924204/AnsiballZ_service_facts.py" <<< 24971 1727096434.68566: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096434.68601: stderr chunk (state=3): >>><<< 24971 1727096434.68610: stdout chunk (state=3): >>><<< 24971 1727096434.68658: done transferring module to remote 24971 1727096434.68678: _low_level_execute_command(): starting 24971 1727096434.68738: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096434.6173882-25930-46544389924204/ /root/.ansible/tmp/ansible-tmp-1727096434.6173882-25930-46544389924204/AnsiballZ_service_facts.py && sleep 0' 24971 1727096434.69397: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24971 1727096434.69412: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096434.69519: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096434.69551: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096434.69571: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096434.69595: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096434.69671: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096434.71494: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096434.71549: stderr chunk (state=3): >>><<< 24971 1727096434.71560: stdout chunk (state=3): >>><<< 24971 1727096434.71612: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096434.71615: _low_level_execute_command(): starting 24971 1727096434.71618: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096434.6173882-25930-46544389924204/AnsiballZ_service_facts.py && sleep 0' 24971 1727096434.72398: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096434.72413: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096434.72438: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096434.72461: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096434.72620: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096436.24029: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", 
"source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", 
"status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": 
"systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": 
"dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", 
"status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 24971 1727096436.25874: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. <<< 24971 1727096436.25878: stdout chunk (state=3): >>><<< 24971 1727096436.25881: stderr chunk (state=3): >>><<< 24971 1727096436.25885: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": 
"systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, 
"ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": 
"unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
24971 1727096436.27550: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096434.6173882-25930-46544389924204/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24971 1727096436.27571: _low_level_execute_command(): starting 24971 1727096436.27682: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096434.6173882-25930-46544389924204/ > /dev/null 2>&1 && sleep 0' 24971 1727096436.29192: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096436.29196: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096436.29209: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096436.29260: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096436.31086: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096436.31123: stderr chunk (state=3): >>><<< 24971 1727096436.31131: stdout chunk (state=3): >>><<< 24971 1727096436.31148: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096436.31158: handler run complete 24971 1727096436.31575: variable 'ansible_facts' from source: unknown 24971 1727096436.31739: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096436.32917: variable 'ansible_facts' from source: unknown 24971 1727096436.33238: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096436.33935: attempt loop complete, returning result 24971 1727096436.33958: _execute() done 24971 1727096436.33974: dumping result to json 24971 1727096436.34040: done dumping result, returning 24971 1727096436.34059: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running [0afff68d-5257-3482-6844-000000000518] 24971 1727096436.34081: sending task result for task 0afff68d-5257-3482-6844-000000000518 24971 1727096436.36691: done sending task result for task 0afff68d-5257-3482-6844-000000000518 24971 1727096436.36694: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 24971 1727096436.36804: no more pending results, returning what we have 24971 1727096436.36807: results queue empty 24971 1727096436.36808: checking for any_errors_fatal 24971 1727096436.36811: done checking for any_errors_fatal 24971 1727096436.36812: checking for max_fail_percentage 24971 1727096436.36813: done checking for max_fail_percentage 24971 1727096436.36814: checking to see if all hosts have failed and the running result is not ok 24971 1727096436.36815: done checking to see if all hosts have failed 24971 1727096436.36815: getting the remaining hosts for this loop 24971 1727096436.36816: done getting the remaining hosts for this loop 24971 1727096436.36820: getting the next task for host managed_node3 24971 1727096436.36824: done getting next task for host managed_node3 24971 1727096436.36836: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 24971 1727096436.36842: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 24971 1727096436.36853: getting variables 24971 1727096436.36854: in VariableManager get_vars() 24971 1727096436.36887: Calling all_inventory to load vars for managed_node3 24971 1727096436.36890: Calling groups_inventory to load vars for managed_node3 24971 1727096436.36892: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096436.36900: Calling all_plugins_play to load vars for managed_node3 24971 1727096436.36907: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096436.36910: Calling groups_plugins_play to load vars for managed_node3 24971 1727096436.39227: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096436.40885: done with get_vars() 24971 1727096436.40908: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Monday 23 September 2024 09:00:36 -0400 (0:00:01.845) 0:00:23.890 ****** 24971 1727096436.41248: entering _queue_task() for managed_node3/package_facts 24971 1727096436.41826: worker is 1 (out of 1 available) 24971 1727096436.41835: exiting _queue_task() for managed_node3/package_facts 24971 1727096436.41846: done queuing things up, now waiting for results queue to drain 24971 1727096436.41847: waiting for pending results... 24971 1727096436.42394: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 24971 1727096436.42776: in run() - task 0afff68d-5257-3482-6844-000000000519 24971 1727096436.42780: variable 'ansible_search_path' from source: unknown 24971 1727096436.42783: variable 'ansible_search_path' from source: unknown 24971 1727096436.42835: calling self._execute() 24971 1727096436.43197: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096436.43202: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096436.43205: variable 'omit' from source: magic vars 24971 1727096436.44074: variable 'ansible_distribution_major_version' from source: facts 24971 1727096436.44078: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096436.44081: variable 'omit' from source: magic vars 24971 1727096436.44084: variable 'omit' from source: magic vars 24971 1727096436.44196: variable 'omit' from source: magic vars 24971 1727096436.44237: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24971 1727096436.44279: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24971 1727096436.44398: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24971 1727096436.44422: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096436.44437: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096436.44510: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24971 1727096436.44520: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096436.44529: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096436.44713: Set connection var ansible_shell_type to sh 24971 
1727096436.44802: Set connection var ansible_shell_executable to /bin/sh 24971 1727096436.44846: Set connection var ansible_timeout to 10 24971 1727096436.44977: Set connection var ansible_connection to ssh 24971 1727096436.44980: Set connection var ansible_pipelining to False 24971 1727096436.44983: Set connection var ansible_module_compression to ZIP_DEFLATED 24971 1727096436.45023: variable 'ansible_shell_executable' from source: unknown 24971 1727096436.45030: variable 'ansible_connection' from source: unknown 24971 1727096436.45037: variable 'ansible_module_compression' from source: unknown 24971 1727096436.45048: variable 'ansible_shell_type' from source: unknown 24971 1727096436.45056: variable 'ansible_shell_executable' from source: unknown 24971 1727096436.45073: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096436.45082: variable 'ansible_pipelining' from source: unknown 24971 1727096436.45088: variable 'ansible_timeout' from source: unknown 24971 1727096436.45274: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096436.45473: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 24971 1727096436.45604: variable 'omit' from source: magic vars 24971 1727096436.45613: starting attempt loop 24971 1727096436.45620: running the handler 24971 1727096436.45637: _low_level_execute_command(): starting 24971 1727096436.45648: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24971 1727096436.47043: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096436.47060: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096436.47462: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096436.47558: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096436.49162: stdout chunk (state=3): >>>/root <<< 24971 1727096436.49245: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096436.49395: stderr chunk (state=3): >>><<< 24971 1727096436.49399: stdout chunk (state=3): >>><<< 24971 1727096436.49424: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096436.49436: _low_level_execute_command(): starting 24971 1727096436.49442: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096436.4942315-26012-73440718313958 `" && echo ansible-tmp-1727096436.4942315-26012-73440718313958="` echo /root/.ansible/tmp/ansible-tmp-1727096436.4942315-26012-73440718313958 `" ) && sleep 0' 24971 1727096436.50689: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096436.50792: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096436.50887: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096436.50895: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096436.50971: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096436.53075: stdout chunk (state=3): >>>ansible-tmp-1727096436.4942315-26012-73440718313958=/root/.ansible/tmp/ansible-tmp-1727096436.4942315-26012-73440718313958 <<< 24971 1727096436.53079: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096436.53082: stdout chunk (state=3): >>><<< 24971 1727096436.53084: stderr chunk (state=3): >>><<< 24971 1727096436.53114: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096436.4942315-26012-73440718313958=/root/.ansible/tmp/ansible-tmp-1727096436.4942315-26012-73440718313958 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096436.53158: variable 'ansible_module_compression' from source: unknown 24971 1727096436.53351: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24971ctsu8jan/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 24971 1727096436.53554: variable 'ansible_facts' from source: unknown 24971 1727096436.53855: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096436.4942315-26012-73440718313958/AnsiballZ_package_facts.py 24971 1727096436.54112: Sending initial data 24971 1727096436.54116: Sent initial data (161 bytes) 24971 1727096436.55354: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24971 1727096436.55403: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096436.55415: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096436.55429: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24971 1727096436.55610: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096436.55665: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096436.55688: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096436.55963: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096436.57701: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" 
revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 debug2: Sending SSH2_FXP_REALPATH "." debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096436.4942315-26012-73440718313958/AnsiballZ_package_facts.py" debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24971ctsu8jan/tmpr0vowqf0" to remote "/root/.ansible/tmp/ansible-tmp-1727096436.4942315-26012-73440718313958/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096436.4942315-26012-73440718313958/AnsiballZ_package_facts.py" <<< 24971 1727096436.57705: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24971ctsu8jan/tmpr0vowqf0 /root/.ansible/tmp/ansible-tmp-1727096436.4942315-26012-73440718313958/AnsiballZ_package_facts.py <<< 24971 1727096436.60608: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096436.60911: stderr chunk (state=3): >>><<< 24971 1727096436.60915: stdout chunk (state=3): >>><<< 24971 1727096436.60944: done transferring module to remote 24971 1727096436.60956: _low_level_execute_command(): starting 24971 1727096436.60961: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096436.4942315-26012-73440718313958/ /root/.ansible/tmp/ansible-tmp-1727096436.4942315-26012-73440718313958/AnsiballZ_package_facts.py && sleep 0' 24971 1727096436.62138: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24971 1727096436.62205: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 24971 1727096436.62209: stderr chunk (state=3): >>>debug2: match not found <<< 24971 1727096436.62219: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096436.62384: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096436.62402: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096436.62423: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096436.62532: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096436.62638: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096436.64390: stderr chunk (state=3): >>>debug2: Received exit status from 
master 0 <<< 24971 1727096436.64474: stderr chunk (state=3): >>><<< 24971 1727096436.64482: stdout chunk (state=3): >>><<< 24971 1727096436.64486: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096436.64488: _low_level_execute_command(): starting 24971 1727096436.64491: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096436.4942315-26012-73440718313958/AnsiballZ_package_facts.py && sleep 0' 24971 1727096436.65742: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24971 1727096436.65753: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096436.65769: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096436.66010: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096436.66013: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096436.66016: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096436.66117: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096437.10565: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", 
"version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "rele<<< 24971 1727096437.10588: stdout chunk (state=3): >>>ase": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", 
"epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": 
"util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null,<<< 24971 1727096437.10978: stdout chunk (state=3): >>> "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": 
"6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.7<<< 24971 1727096437.10991: stdout chunk (state=3): >>>3.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": 
"python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": 
"perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": 
"noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", 
"source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", 
"release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 24971 1727096437.12541: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096437.12552: stderr chunk (state=3): >>>Shared connection to 10.31.14.152 closed. 
<<< 24971 1727096437.12682: stderr chunk (state=3): >>><<< 24971 1727096437.12692: stdout chunk (state=3): >>><<< 24971 1727096437.12782: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": 
[{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", 
"release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", 
"version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": 
"1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": 
"1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": 
[{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": 
"510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], 
"perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], 
"perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 24971 1727096437.17655: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096436.4942315-26012-73440718313958/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24971 1727096437.17762: _low_level_execute_command(): starting 24971 1727096437.17770: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096436.4942315-26012-73440718313958/ > /dev/null 2>&1 && sleep 0' 24971 1727096437.18879: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096437.18975: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096437.19091: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096437.19095: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096437.19097: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096437.19220: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096437.21127: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096437.21131: stdout chunk (state=3): >>><<< 24971 1727096437.21136: stderr chunk (state=3): >>><<< 24971 1727096437.21152: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096437.21159: handler run complete 24971 1727096437.21921: variable 'ansible_facts' from source: unknown 24971 1727096437.22409: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096437.25235: variable 'ansible_facts' from source: unknown 24971 1727096437.25661: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096437.26349: attempt loop complete, returning result 24971 1727096437.26359: _execute() done 24971 1727096437.26362: dumping result to json 24971 1727096437.26561: done dumping result, returning 24971 1727096437.26574: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [0afff68d-5257-3482-6844-000000000519] 24971 1727096437.26577: sending task result for task 0afff68d-5257-3482-6844-000000000519 24971 1727096437.29163: done sending task result for task 0afff68d-5257-3482-6844-000000000519 24971 1727096437.29172: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 24971 1727096437.29322: no more pending results, returning what we have 24971 1727096437.29325: results queue empty 24971 1727096437.29327: checking for any_errors_fatal 24971 1727096437.29332: done checking for any_errors_fatal 24971 1727096437.29333: checking for max_fail_percentage 24971 1727096437.29335: done checking for max_fail_percentage 24971 1727096437.29336: checking to see if all hosts have failed and the running result is not ok 24971 1727096437.29337: done checking to see if all hosts have failed 24971 1727096437.29338: getting the remaining hosts for this loop 24971 1727096437.29339: done getting the remaining hosts for this loop 24971 1727096437.29343: getting the next task for host managed_node3 24971 1727096437.29350: done getting next task for host managed_node3 24971 1727096437.29353: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 24971 1727096437.29356: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 24971 1727096437.29372: getting variables 24971 1727096437.29374: in VariableManager get_vars() 24971 1727096437.29515: Calling all_inventory to load vars for managed_node3 24971 1727096437.29518: Calling groups_inventory to load vars for managed_node3 24971 1727096437.29520: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096437.29530: Calling all_plugins_play to load vars for managed_node3 24971 1727096437.29532: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096437.29534: Calling groups_plugins_play to load vars for managed_node3 24971 1727096437.31252: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096437.33014: done with get_vars() 24971 1727096437.33042: done getting variables 24971 1727096437.33114: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Monday 23 September 2024 09:00:37 -0400 (0:00:00.919) 0:00:24.809 ****** 24971 1727096437.33155: entering _queue_task() for managed_node3/debug 24971 1727096437.33592: worker is 1 (out of 1 available) 24971 1727096437.33604: exiting _queue_task() for managed_node3/debug 24971 1727096437.33617: done queuing things up, now waiting for results queue to drain 24971 1727096437.33618: waiting for pending results... 24971 1727096437.33899: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 24971 1727096437.34039: in run() - task 0afff68d-5257-3482-6844-00000000006f 24971 1727096437.34059: variable 'ansible_search_path' from source: unknown 24971 1727096437.34066: variable 'ansible_search_path' from source: unknown 24971 1727096437.34173: calling self._execute() 24971 1727096437.34229: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096437.34240: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096437.34260: variable 'omit' from source: magic vars 24971 1727096437.34669: variable 'ansible_distribution_major_version' from source: facts 24971 1727096437.34693: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096437.34705: variable 'omit' from source: magic vars 24971 1727096437.34779: variable 'omit' from source: magic vars 24971 1727096437.34899: variable 'network_provider' from source: set_fact 24971 1727096437.34972: variable 'omit' from source: magic vars 24971 1727096437.34977: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24971 1727096437.35018: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24971 1727096437.35098: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24971 1727096437.35101: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096437.35103: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 
1727096437.35123: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24971 1727096437.35131: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096437.35139: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096437.35313: Set connection var ansible_shell_type to sh 24971 1727096437.35318: Set connection var ansible_shell_executable to /bin/sh 24971 1727096437.35320: Set connection var ansible_timeout to 10 24971 1727096437.35327: Set connection var ansible_connection to ssh 24971 1727096437.35330: Set connection var ansible_pipelining to False 24971 1727096437.35332: Set connection var ansible_module_compression to ZIP_DEFLATED 24971 1727096437.35361: variable 'ansible_shell_executable' from source: unknown 24971 1727096437.35370: variable 'ansible_connection' from source: unknown 24971 1727096437.35378: variable 'ansible_module_compression' from source: unknown 24971 1727096437.35423: variable 'ansible_shell_type' from source: unknown 24971 1727096437.35426: variable 'ansible_shell_executable' from source: unknown 24971 1727096437.35429: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096437.35431: variable 'ansible_pipelining' from source: unknown 24971 1727096437.35435: variable 'ansible_timeout' from source: unknown 24971 1727096437.35437: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096437.35638: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24971 1727096437.35642: variable 'omit' from source: magic vars 24971 1727096437.35644: starting attempt loop 24971 1727096437.35646: running the handler 24971 1727096437.35649: handler run complete 24971 1727096437.35681: attempt loop complete, returning result 24971 1727096437.35689: _execute() done 24971 1727096437.35696: dumping result to json 24971 1727096437.35704: done dumping result, returning 24971 1727096437.35715: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [0afff68d-5257-3482-6844-00000000006f] 24971 1727096437.35765: sending task result for task 0afff68d-5257-3482-6844-00000000006f 24971 1727096437.35832: done sending task result for task 0afff68d-5257-3482-6844-00000000006f 24971 1727096437.35835: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: Using network provider: nm 24971 1727096437.35933: no more pending results, returning what we have 24971 1727096437.35936: results queue empty 24971 1727096437.35938: checking for any_errors_fatal 24971 1727096437.35950: done checking for any_errors_fatal 24971 1727096437.35951: checking for max_fail_percentage 24971 1727096437.35952: done checking for max_fail_percentage 24971 1727096437.35953: checking to see if all hosts have failed and the running result is not ok 24971 1727096437.35954: done checking to see if all hosts have failed 24971 1727096437.35955: getting the remaining hosts for this loop 24971 1727096437.35956: done getting the remaining hosts for this loop 24971 1727096437.35960: getting the next task for host managed_node3 24971 1727096437.35966: done getting next task for host managed_node3 24971 1727096437.35970: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state 
configuration if using the `network_state` variable with the initscripts provider 24971 1727096437.35974: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 24971 1727096437.35987: getting variables 24971 1727096437.35988: in VariableManager get_vars() 24971 1727096437.36029: Calling all_inventory to load vars for managed_node3 24971 1727096437.36031: Calling groups_inventory to load vars for managed_node3 24971 1727096437.36034: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096437.36044: Calling all_plugins_play to load vars for managed_node3 24971 1727096437.36047: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096437.36050: Calling groups_plugins_play to load vars for managed_node3 24971 1727096437.42649: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096437.44916: done with get_vars() 24971 1727096437.44984: done getting variables 24971 1727096437.45072: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Monday 23 September 2024 09:00:37 -0400 (0:00:00.119) 0:00:24.928 ****** 24971 1727096437.45118: entering _queue_task() for managed_node3/fail 24971 1727096437.45693: worker is 1 (out of 1 available) 24971 1727096437.45709: exiting _queue_task() for managed_node3/fail 24971 1727096437.45724: done queuing things up, now waiting for results queue to drain 24971 1727096437.45726: waiting for pending results... 
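The task announced above (roles/network/tasks/main.yml:11) is handled by the 'fail' action plugin, and its skip result further down reports the condition network_state != {}. A minimal sketch of what such a task could look like follows; the role file itself is not reproduced in this log, so the msg wording and the extra provider condition are assumptions inferred from the task name:

- name: >-
    Abort applying the network state configuration if using the
    `network_state` variable with the initscripts provider
  ansible.builtin.fail:
    msg: Applying network_state is not supported with the initscripts provider  # assumed wording
  when:
    - network_state != {}                # the only condition visible in the skip result below
    - network_provider == "initscripts"  # implied by the task name; not shown in this log
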
24971 1727096437.46211: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 24971 1727096437.46275: in run() - task 0afff68d-5257-3482-6844-000000000070 24971 1727096437.46307: variable 'ansible_search_path' from source: unknown 24971 1727096437.46324: variable 'ansible_search_path' from source: unknown 24971 1727096437.46363: calling self._execute() 24971 1727096437.46539: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096437.46559: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096437.46583: variable 'omit' from source: magic vars 24971 1727096437.47099: variable 'ansible_distribution_major_version' from source: facts 24971 1727096437.47116: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096437.47396: variable 'network_state' from source: role '' defaults 24971 1727096437.47402: Evaluated conditional (network_state != {}): False 24971 1727096437.47405: when evaluation is False, skipping this task 24971 1727096437.47407: _execute() done 24971 1727096437.47409: dumping result to json 24971 1727096437.47412: done dumping result, returning 24971 1727096437.47415: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0afff68d-5257-3482-6844-000000000070] 24971 1727096437.47418: sending task result for task 0afff68d-5257-3482-6844-000000000070 24971 1727096437.47493: done sending task result for task 0afff68d-5257-3482-6844-000000000070 24971 1727096437.47497: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 24971 1727096437.47555: no more pending results, returning what we have 24971 1727096437.47558: results queue empty 24971 1727096437.47559: checking for any_errors_fatal 24971 1727096437.47571: done checking for any_errors_fatal 24971 1727096437.47572: checking for max_fail_percentage 24971 1727096437.47574: done checking for max_fail_percentage 24971 1727096437.47575: checking to see if all hosts have failed and the running result is not ok 24971 1727096437.47576: done checking to see if all hosts have failed 24971 1727096437.47576: getting the remaining hosts for this loop 24971 1727096437.47578: done getting the remaining hosts for this loop 24971 1727096437.47581: getting the next task for host managed_node3 24971 1727096437.47589: done getting next task for host managed_node3 24971 1727096437.47592: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 24971 1727096437.47596: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 24971 1727096437.47618: getting variables 24971 1727096437.47620: in VariableManager get_vars() 24971 1727096437.47773: Calling all_inventory to load vars for managed_node3 24971 1727096437.47784: Calling groups_inventory to load vars for managed_node3 24971 1727096437.47801: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096437.47825: Calling all_plugins_play to load vars for managed_node3 24971 1727096437.47833: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096437.47838: Calling groups_plugins_play to load vars for managed_node3 24971 1727096437.49504: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096437.51476: done with get_vars() 24971 1727096437.51505: done getting variables 24971 1727096437.51602: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Monday 23 September 2024 09:00:37 -0400 (0:00:00.065) 0:00:24.994 ****** 24971 1727096437.51654: entering _queue_task() for managed_node3/fail 24971 1727096437.52202: worker is 1 (out of 1 available) 24971 1727096437.52215: exiting _queue_task() for managed_node3/fail 24971 1727096437.52230: done queuing things up, now waiting for results queue to drain 24971 1727096437.52232: waiting for pending results... 
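The next task (main.yml:18) is also a 'fail' action, and its result below again reports network_state != {} as the false condition. A sketch under the same assumptions, with the version guard implied by the task name marked as an assumption:

- name: >-
    Abort applying the network state configuration if the system version
    of the managed host is below 8
  ansible.builtin.fail:
    msg: Applying network_state requires a managed host running EL 8 or later  # assumed wording
  when:
    - network_state != {}                           # condition reported in the skip below
    - ansible_distribution_major_version | int < 8  # implied by the task name; assumption
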
24971 1727096437.53031: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 24971 1727096437.53040: in run() - task 0afff68d-5257-3482-6844-000000000071 24971 1727096437.53045: variable 'ansible_search_path' from source: unknown 24971 1727096437.53048: variable 'ansible_search_path' from source: unknown 24971 1727096437.53050: calling self._execute() 24971 1727096437.53256: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096437.53264: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096437.53271: variable 'omit' from source: magic vars 24971 1727096437.53747: variable 'ansible_distribution_major_version' from source: facts 24971 1727096437.53758: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096437.53886: variable 'network_state' from source: role '' defaults 24971 1727096437.53896: Evaluated conditional (network_state != {}): False 24971 1727096437.53899: when evaluation is False, skipping this task 24971 1727096437.53902: _execute() done 24971 1727096437.53904: dumping result to json 24971 1727096437.53907: done dumping result, returning 24971 1727096437.53916: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0afff68d-5257-3482-6844-000000000071] 24971 1727096437.53919: sending task result for task 0afff68d-5257-3482-6844-000000000071 skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 24971 1727096437.54137: no more pending results, returning what we have 24971 1727096437.54140: results queue empty 24971 1727096437.54141: checking for any_errors_fatal 24971 1727096437.54147: done checking for any_errors_fatal 24971 1727096437.54148: checking for max_fail_percentage 24971 1727096437.54149: done checking for max_fail_percentage 24971 1727096437.54150: checking to see if all hosts have failed and the running result is not ok 24971 1727096437.54151: done checking to see if all hosts have failed 24971 1727096437.54152: getting the remaining hosts for this loop 24971 1727096437.54153: done getting the remaining hosts for this loop 24971 1727096437.54156: getting the next task for host managed_node3 24971 1727096437.54162: done getting next task for host managed_node3 24971 1727096437.54165: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 24971 1727096437.54175: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 24971 1727096437.54198: getting variables 24971 1727096437.54200: in VariableManager get_vars() 24971 1727096437.54238: Calling all_inventory to load vars for managed_node3 24971 1727096437.54241: Calling groups_inventory to load vars for managed_node3 24971 1727096437.54244: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096437.54254: Calling all_plugins_play to load vars for managed_node3 24971 1727096437.54256: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096437.54259: Calling groups_plugins_play to load vars for managed_node3 24971 1727096437.54918: done sending task result for task 0afff68d-5257-3482-6844-000000000071 24971 1727096437.54922: WORKER PROCESS EXITING 24971 1727096437.56001: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096437.58232: done with get_vars() 24971 1727096437.58264: done getting variables 24971 1727096437.58472: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Monday 23 September 2024 09:00:37 -0400 (0:00:00.068) 0:00:25.062 ****** 24971 1727096437.58513: entering _queue_task() for managed_node3/fail 24971 1727096437.59050: worker is 1 (out of 1 available) 24971 1727096437.59062: exiting _queue_task() for managed_node3/fail 24971 1727096437.59111: done queuing things up, now waiting for results queue to drain 24971 1727096437.59112: waiting for pending results... 
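The teaming guard at main.yml:25 is another 'fail' action; all three conditions it evaluates are visible in the trace that follows (the distribution major version check, membership in __network_rh_distros, and the selectattr filter over network_connections / network_state). A sketch assembling those reported conditions into a task, with only the msg wording assumed:

- name: >-
    Abort applying teaming configuration if the system version of the
    managed host is EL10 or later
  ansible.builtin.fail:
    msg: Teaming is not supported on EL10 or later  # assumed wording
  when:
    - ansible_distribution_major_version | int > 9
    - ansible_distribution in __network_rh_distros
    - >-
      network_connections | selectattr("type", "defined") |
      selectattr("type", "match", "^team$") | list | length > 0
      or network_state.get("interfaces", []) | selectattr("type", "defined") |
      selectattr("type", "match", "^team$") | list | length > 0
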
24971 1727096437.59459: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 24971 1727096437.59613: in run() - task 0afff68d-5257-3482-6844-000000000072 24971 1727096437.59630: variable 'ansible_search_path' from source: unknown 24971 1727096437.59634: variable 'ansible_search_path' from source: unknown 24971 1727096437.59678: calling self._execute() 24971 1727096437.59798: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096437.59822: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096437.59830: variable 'omit' from source: magic vars 24971 1727096437.60439: variable 'ansible_distribution_major_version' from source: facts 24971 1727096437.60493: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096437.60742: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24971 1727096437.63711: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24971 1727096437.63731: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24971 1727096437.63805: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24971 1727096437.63828: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24971 1727096437.63863: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24971 1727096437.64080: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24971 1727096437.64084: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24971 1727096437.64087: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24971 1727096437.64090: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24971 1727096437.64126: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24971 1727096437.64257: variable 'ansible_distribution_major_version' from source: facts 24971 1727096437.64271: Evaluated conditional (ansible_distribution_major_version | int > 9): True 24971 1727096437.64387: variable 'ansible_distribution' from source: facts 24971 1727096437.64390: variable '__network_rh_distros' from source: role '' defaults 24971 1727096437.64422: Evaluated conditional (ansible_distribution in __network_rh_distros): True 24971 1727096437.64612: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24971 1727096437.64632: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24971 1727096437.64654: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24971 1727096437.64682: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24971 1727096437.64692: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24971 1727096437.64726: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24971 1727096437.64743: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24971 1727096437.64791: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24971 1727096437.64808: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24971 1727096437.64818: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24971 1727096437.64850: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24971 1727096437.64877: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24971 1727096437.64894: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24971 1727096437.64918: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24971 1727096437.64943: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24971 1727096437.65135: variable 'network_connections' from source: task vars 24971 1727096437.65144: variable 'interface' from source: play vars 24971 1727096437.65205: variable 'interface' from source: play vars 24971 1727096437.65213: variable 'network_state' from source: role '' defaults 24971 1727096437.65300: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24971 1727096437.65479: 
Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24971 1727096437.65513: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24971 1727096437.65544: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24971 1727096437.65569: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24971 1727096437.65613: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 24971 1727096437.65637: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 24971 1727096437.65657: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 24971 1727096437.65676: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 24971 1727096437.65695: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 24971 1727096437.65699: when evaluation is False, skipping this task 24971 1727096437.65702: _execute() done 24971 1727096437.65704: dumping result to json 24971 1727096437.65706: done dumping result, returning 24971 1727096437.65713: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0afff68d-5257-3482-6844-000000000072] 24971 1727096437.65718: sending task result for task 0afff68d-5257-3482-6844-000000000072 24971 1727096437.65801: done sending task result for task 0afff68d-5257-3482-6844-000000000072 24971 1727096437.65803: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 24971 1727096437.65848: no more pending results, returning what we have 24971 1727096437.65851: results queue empty 24971 1727096437.65852: checking for any_errors_fatal 24971 1727096437.65859: done checking for any_errors_fatal 24971 1727096437.65859: checking for max_fail_percentage 24971 1727096437.65861: done checking for max_fail_percentage 24971 1727096437.65862: checking to see if all hosts have failed and the running result is not ok 24971 1727096437.65863: done checking to see if all hosts have failed 24971 1727096437.65863: getting the remaining hosts for this loop 24971 1727096437.65864: done getting the remaining hosts for this loop 24971 1727096437.65870: getting the next task for host managed_node3 24971 1727096437.65876: done getting next task for host managed_node3 24971 1727096437.65880: ^ task is: TASK: 
fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 24971 1727096437.65882: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 24971 1727096437.65900: getting variables 24971 1727096437.65902: in VariableManager get_vars() 24971 1727096437.65947: Calling all_inventory to load vars for managed_node3 24971 1727096437.65949: Calling groups_inventory to load vars for managed_node3 24971 1727096437.65951: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096437.65961: Calling all_plugins_play to load vars for managed_node3 24971 1727096437.65963: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096437.65966: Calling groups_plugins_play to load vars for managed_node3 24971 1727096437.66913: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096437.68255: done with get_vars() 24971 1727096437.68281: done getting variables 24971 1727096437.68339: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Monday 23 September 2024 09:00:37 -0400 (0:00:00.098) 0:00:25.161 ****** 24971 1727096437.68373: entering _queue_task() for managed_node3/dnf 24971 1727096437.68633: worker is 1 (out of 1 available) 24971 1727096437.68645: exiting _queue_task() for managed_node3/dnf 24971 1727096437.68658: done queuing things up, now waiting for results queue to drain 24971 1727096437.68660: waiting for pending results... 
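For reference, the false_condition logged for the "Abort applying teaming configuration ..." task above corresponds to a task whose when-clause matches the logged expression. The following is only a sketch reconstructed from that expression, not the role's actual source; the module choice and fail message are assumptions:

    - name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
      ansible.builtin.fail:
        msg: Teaming configuration is not supported on this system version  # placeholder, not from the log
      when: >-
        network_connections | selectattr("type", "defined")
          | selectattr("type", "match", "^team$") | list | length > 0
        or network_state.get("interfaces", [])
          | selectattr("type", "defined")
          | selectattr("type", "match", "^team$") | list | length > 0

Since neither network_connections nor network_state defines a team-typed entry in this run, the expression evaluates to False and the task is skipped, exactly as reported above.
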
24971 1727096437.68860: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 24971 1727096437.68970: in run() - task 0afff68d-5257-3482-6844-000000000073 24971 1727096437.68986: variable 'ansible_search_path' from source: unknown 24971 1727096437.68990: variable 'ansible_search_path' from source: unknown 24971 1727096437.69017: calling self._execute() 24971 1727096437.69093: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096437.69096: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096437.69108: variable 'omit' from source: magic vars 24971 1727096437.69416: variable 'ansible_distribution_major_version' from source: facts 24971 1727096437.69426: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096437.69583: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24971 1727096437.71270: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24971 1727096437.71327: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24971 1727096437.71352: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24971 1727096437.71396: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24971 1727096437.71432: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24971 1727096437.71486: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24971 1727096437.71507: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24971 1727096437.71527: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24971 1727096437.71571: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24971 1727096437.71581: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24971 1727096437.71667: variable 'ansible_distribution' from source: facts 24971 1727096437.71672: variable 'ansible_distribution_major_version' from source: facts 24971 1727096437.71685: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 24971 1727096437.71763: variable '__network_wireless_connections_defined' from source: role '' defaults 24971 1727096437.71850: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24971 1727096437.71874: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24971 1727096437.71901: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24971 1727096437.71926: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24971 1727096437.71937: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24971 1727096437.71968: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24971 1727096437.71989: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24971 1727096437.72019: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24971 1727096437.72125: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24971 1727096437.72128: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24971 1727096437.72131: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24971 1727096437.72133: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24971 1727096437.72288: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24971 1727096437.72291: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24971 1727096437.72296: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24971 1727096437.72352: variable 'network_connections' from source: task vars 24971 1727096437.72376: variable 'interface' from source: play vars 24971 1727096437.72444: variable 'interface' from source: play vars 24971 1727096437.72539: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24971 1727096437.72724: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24971 1727096437.72769: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24971 1727096437.72797: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24971 1727096437.72855: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24971 1727096437.72899: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 24971 1727096437.72915: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 24971 1727096437.72996: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 24971 1727096437.73001: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 24971 1727096437.73089: variable '__network_team_connections_defined' from source: role '' defaults 24971 1727096437.73301: variable 'network_connections' from source: task vars 24971 1727096437.73305: variable 'interface' from source: play vars 24971 1727096437.73393: variable 'interface' from source: play vars 24971 1727096437.73396: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 24971 1727096437.73399: when evaluation is False, skipping this task 24971 1727096437.73401: _execute() done 24971 1727096437.73404: dumping result to json 24971 1727096437.73406: done dumping result, returning 24971 1727096437.73408: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0afff68d-5257-3482-6844-000000000073] 24971 1727096437.73410: sending task result for task 0afff68d-5257-3482-6844-000000000073 24971 1727096437.73628: done sending task result for task 0afff68d-5257-3482-6844-000000000073 24971 1727096437.73631: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 24971 1727096437.73688: no more pending results, returning what we have 24971 1727096437.73691: results queue empty 24971 1727096437.73692: checking for any_errors_fatal 24971 1727096437.73697: done checking for any_errors_fatal 24971 1727096437.73698: checking for max_fail_percentage 24971 1727096437.73699: done checking for max_fail_percentage 24971 1727096437.73700: checking to see if all hosts have failed and the running result is not ok 24971 1727096437.73701: done checking to see if all hosts have failed 24971 1727096437.73702: getting the remaining hosts for this loop 24971 1727096437.73703: done getting the remaining hosts for this loop 24971 1727096437.73706: getting the next task for host managed_node3 24971 1727096437.73711: done getting next task for host managed_node3 24971 1727096437.73714: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates 
for network packages are available through the YUM package manager due to wireless or team interfaces 24971 1727096437.73717: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 24971 1727096437.73732: getting variables 24971 1727096437.73737: in VariableManager get_vars() 24971 1727096437.73777: Calling all_inventory to load vars for managed_node3 24971 1727096437.73780: Calling groups_inventory to load vars for managed_node3 24971 1727096437.73782: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096437.73790: Calling all_plugins_play to load vars for managed_node3 24971 1727096437.73796: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096437.73799: Calling groups_plugins_play to load vars for managed_node3 24971 1727096437.75659: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096437.77384: done with get_vars() 24971 1727096437.77408: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 24971 1727096437.77497: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Monday 23 September 2024 09:00:37 -0400 (0:00:00.091) 0:00:25.253 ****** 24971 1727096437.77528: entering _queue_task() for managed_node3/yum 24971 1727096437.77901: worker is 1 (out of 1 available) 24971 1727096437.77922: exiting _queue_task() for managed_node3/yum 24971 1727096437.77935: done queuing things up, now waiting for results queue to drain 24971 1727096437.77936: waiting for pending results... 
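The DNF availability check skipped above is guarded by the two role-default booleans shown in the log. A task shaped roughly like the following sketch would produce the logged behaviour; only the when-condition appears in the log, so the module arguments here are assumptions:

    - name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
      ansible.builtin.dnf:
        list: updates  # assumed arguments; the log shows only the condition
      when: __network_wireless_connections_defined or __network_team_connections_defined
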
24971 1727096437.78366: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 24971 1727096437.78419: in run() - task 0afff68d-5257-3482-6844-000000000074 24971 1727096437.78438: variable 'ansible_search_path' from source: unknown 24971 1727096437.78446: variable 'ansible_search_path' from source: unknown 24971 1727096437.78507: calling self._execute() 24971 1727096437.78612: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096437.78624: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096437.78638: variable 'omit' from source: magic vars 24971 1727096437.79117: variable 'ansible_distribution_major_version' from source: facts 24971 1727096437.79121: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096437.79333: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24971 1727096437.81680: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24971 1727096437.81771: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24971 1727096437.81816: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24971 1727096437.81892: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24971 1727096437.81903: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24971 1727096437.82001: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24971 1727096437.82036: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24971 1727096437.82070: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24971 1727096437.82126: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24971 1727096437.82148: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24971 1727096437.82260: variable 'ansible_distribution_major_version' from source: facts 24971 1727096437.82375: Evaluated conditional (ansible_distribution_major_version | int < 8): False 24971 1727096437.82379: when evaluation is False, skipping this task 24971 1727096437.82381: _execute() done 24971 1727096437.82383: dumping result to json 24971 1727096437.82385: done dumping result, returning 24971 1727096437.82388: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0afff68d-5257-3482-6844-000000000074] 24971 
1727096437.82391: sending task result for task 0afff68d-5257-3482-6844-000000000074 24971 1727096437.82466: done sending task result for task 0afff68d-5257-3482-6844-000000000074 24971 1727096437.82471: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 24971 1727096437.82531: no more pending results, returning what we have 24971 1727096437.82535: results queue empty 24971 1727096437.82536: checking for any_errors_fatal 24971 1727096437.82544: done checking for any_errors_fatal 24971 1727096437.82545: checking for max_fail_percentage 24971 1727096437.82547: done checking for max_fail_percentage 24971 1727096437.82548: checking to see if all hosts have failed and the running result is not ok 24971 1727096437.82549: done checking to see if all hosts have failed 24971 1727096437.82550: getting the remaining hosts for this loop 24971 1727096437.82551: done getting the remaining hosts for this loop 24971 1727096437.82555: getting the next task for host managed_node3 24971 1727096437.82562: done getting next task for host managed_node3 24971 1727096437.82566: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 24971 1727096437.82572: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 24971 1727096437.82592: getting variables 24971 1727096437.82594: in VariableManager get_vars() 24971 1727096437.82642: Calling all_inventory to load vars for managed_node3 24971 1727096437.82646: Calling groups_inventory to load vars for managed_node3 24971 1727096437.82649: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096437.82660: Calling all_plugins_play to load vars for managed_node3 24971 1727096437.82663: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096437.82667: Calling groups_plugins_play to load vars for managed_node3 24971 1727096437.84801: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096437.87715: done with get_vars() 24971 1727096437.87782: done getting variables 24971 1727096437.87853: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Monday 23 September 2024 09:00:37 -0400 (0:00:00.103) 0:00:25.356 ****** 24971 1727096437.87907: entering _queue_task() for managed_node3/fail 24971 1727096437.88440: worker is 1 (out of 1 available) 24971 1727096437.88453: exiting _queue_task() for managed_node3/fail 24971 1727096437.88471: done queuing things up, now waiting for results queue to drain 24971 1727096437.88472: waiting for pending results... 
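The YUM variant of the same check is additionally gated on the distribution major version, and that is the condition that failed here. Note the log entry above showing ansible-core redirecting ansible.builtin.yum to ansible.builtin.dnf. A sketch consistent with the logged false_condition (any further conditions or module options are assumptions) might read:

    - name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
      ansible.builtin.yum:  # redirected to ansible.builtin.dnf by ansible-core, as logged above
        list: updates  # assumed arguments
      when: ansible_distribution_major_version | int < 8
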
24971 1727096437.88980: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 24971 1727096437.88986: in run() - task 0afff68d-5257-3482-6844-000000000075 24971 1727096437.88989: variable 'ansible_search_path' from source: unknown 24971 1727096437.88992: variable 'ansible_search_path' from source: unknown 24971 1727096437.89024: calling self._execute() 24971 1727096437.89138: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096437.89154: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096437.89187: variable 'omit' from source: magic vars 24971 1727096437.89623: variable 'ansible_distribution_major_version' from source: facts 24971 1727096437.89637: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096437.89727: variable '__network_wireless_connections_defined' from source: role '' defaults 24971 1727096437.89855: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24971 1727096437.91721: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24971 1727096437.91775: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24971 1727096437.91805: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24971 1727096437.91830: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24971 1727096437.91850: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24971 1727096437.91914: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24971 1727096437.91938: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24971 1727096437.91954: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24971 1727096437.91984: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24971 1727096437.91995: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24971 1727096437.92032: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24971 1727096437.92047: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24971 1727096437.92064: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24971 1727096437.92092: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24971 1727096437.92103: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24971 1727096437.92134: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24971 1727096437.92150: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24971 1727096437.92166: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24971 1727096437.92194: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24971 1727096437.92204: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24971 1727096437.92320: variable 'network_connections' from source: task vars 24971 1727096437.92334: variable 'interface' from source: play vars 24971 1727096437.92383: variable 'interface' from source: play vars 24971 1727096437.92431: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24971 1727096437.92539: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24971 1727096437.92580: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24971 1727096437.92602: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24971 1727096437.92623: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24971 1727096437.92655: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 24971 1727096437.92676: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 24971 1727096437.92695: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 24971 1727096437.92712: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 24971 1727096437.92748: variable '__network_team_connections_defined' from source: role '' defaults 24971 1727096437.92902: variable 'network_connections' from 
source: task vars 24971 1727096437.92906: variable 'interface' from source: play vars 24971 1727096437.92948: variable 'interface' from source: play vars 24971 1727096437.92965: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 24971 1727096437.92973: when evaluation is False, skipping this task 24971 1727096437.92976: _execute() done 24971 1727096437.92978: dumping result to json 24971 1727096437.92980: done dumping result, returning 24971 1727096437.92985: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0afff68d-5257-3482-6844-000000000075] 24971 1727096437.92993: sending task result for task 0afff68d-5257-3482-6844-000000000075 24971 1727096437.93081: done sending task result for task 0afff68d-5257-3482-6844-000000000075 24971 1727096437.93083: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 24971 1727096437.93158: no more pending results, returning what we have 24971 1727096437.93161: results queue empty 24971 1727096437.93162: checking for any_errors_fatal 24971 1727096437.93172: done checking for any_errors_fatal 24971 1727096437.93173: checking for max_fail_percentage 24971 1727096437.93175: done checking for max_fail_percentage 24971 1727096437.93176: checking to see if all hosts have failed and the running result is not ok 24971 1727096437.93177: done checking to see if all hosts have failed 24971 1727096437.93177: getting the remaining hosts for this loop 24971 1727096437.93179: done getting the remaining hosts for this loop 24971 1727096437.93183: getting the next task for host managed_node3 24971 1727096437.93188: done getting next task for host managed_node3 24971 1727096437.93192: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 24971 1727096437.93195: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 24971 1727096437.93213: getting variables 24971 1727096437.93215: in VariableManager get_vars() 24971 1727096437.93250: Calling all_inventory to load vars for managed_node3 24971 1727096437.93252: Calling groups_inventory to load vars for managed_node3 24971 1727096437.93254: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096437.93263: Calling all_plugins_play to load vars for managed_node3 24971 1727096437.93265: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096437.93279: Calling groups_plugins_play to load vars for managed_node3 24971 1727096437.94075: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096437.94958: done with get_vars() 24971 1727096437.94978: done getting variables 24971 1727096437.95024: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Monday 23 September 2024 09:00:37 -0400 (0:00:00.071) 0:00:25.428 ****** 24971 1727096437.95049: entering _queue_task() for managed_node3/package 24971 1727096437.95349: worker is 1 (out of 1 available) 24971 1727096437.95363: exiting _queue_task() for managed_node3/package 24971 1727096437.95581: done queuing things up, now waiting for results queue to drain 24971 1727096437.95583: waiting for pending results... 24971 1727096437.95724: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages 24971 1727096437.95796: in run() - task 0afff68d-5257-3482-6844-000000000076 24971 1727096437.95822: variable 'ansible_search_path' from source: unknown 24971 1727096437.95825: variable 'ansible_search_path' from source: unknown 24971 1727096437.95854: calling self._execute() 24971 1727096437.95957: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096437.95975: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096437.95978: variable 'omit' from source: magic vars 24971 1727096437.96365: variable 'ansible_distribution_major_version' from source: facts 24971 1727096437.96374: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096437.96552: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24971 1727096437.96761: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24971 1727096437.96807: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24971 1727096437.96833: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24971 1727096437.96886: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24971 1727096437.96965: variable 'network_packages' from source: role '' defaults 24971 1727096437.97042: variable '__network_provider_setup' from source: role '' defaults 24971 1727096437.97052: variable '__network_service_name_default_nm' from source: role '' defaults 24971 1727096437.97098: variable 
'__network_service_name_default_nm' from source: role '' defaults 24971 1727096437.97105: variable '__network_packages_default_nm' from source: role '' defaults 24971 1727096437.97152: variable '__network_packages_default_nm' from source: role '' defaults 24971 1727096437.97266: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24971 1727096437.98994: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24971 1727096437.99040: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24971 1727096437.99101: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24971 1727096437.99104: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24971 1727096437.99128: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24971 1727096437.99214: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24971 1727096437.99244: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24971 1727096437.99262: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24971 1727096437.99295: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24971 1727096437.99335: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24971 1727096437.99351: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24971 1727096437.99373: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24971 1727096437.99397: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24971 1727096437.99429: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24971 1727096437.99441: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24971 1727096437.99676: variable '__network_packages_default_gobject_packages' from source: role '' defaults 24971 1727096437.99877: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24971 1727096437.99881: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24971 1727096437.99883: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24971 1727096437.99886: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24971 1727096437.99888: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24971 1727096437.99939: variable 'ansible_python' from source: facts 24971 1727096437.99963: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 24971 1727096438.00041: variable '__network_wpa_supplicant_required' from source: role '' defaults 24971 1727096438.00117: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 24971 1727096438.00309: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24971 1727096438.00312: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24971 1727096438.00316: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24971 1727096438.00318: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24971 1727096438.00329: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24971 1727096438.00375: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24971 1727096438.00398: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24971 1727096438.00422: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24971 1727096438.00457: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24971 1727096438.00474: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24971 1727096438.00601: variable 'network_connections' from source: task vars 24971 1727096438.00608: variable 'interface' from source: play vars 24971 1727096438.00695: variable 'interface' from source: play vars 24971 1727096438.00744: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 24971 1727096438.00775: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 24971 1727096438.00792: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 24971 1727096438.00819: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 24971 1727096438.00854: variable '__network_wireless_connections_defined' from source: role '' defaults 24971 1727096438.01037: variable 'network_connections' from source: task vars 24971 1727096438.01040: variable 'interface' from source: play vars 24971 1727096438.01115: variable 'interface' from source: play vars 24971 1727096438.01138: variable '__network_packages_default_wireless' from source: role '' defaults 24971 1727096438.01195: variable '__network_wireless_connections_defined' from source: role '' defaults 24971 1727096438.01383: variable 'network_connections' from source: task vars 24971 1727096438.01387: variable 'interface' from source: play vars 24971 1727096438.01436: variable 'interface' from source: play vars 24971 1727096438.01453: variable '__network_packages_default_team' from source: role '' defaults 24971 1727096438.01508: variable '__network_team_connections_defined' from source: role '' defaults 24971 1727096438.01701: variable 'network_connections' from source: task vars 24971 1727096438.01704: variable 'interface' from source: play vars 24971 1727096438.01752: variable 'interface' from source: play vars 24971 1727096438.01791: variable '__network_service_name_default_initscripts' from source: role '' defaults 24971 1727096438.01831: variable '__network_service_name_default_initscripts' from source: role '' defaults 24971 1727096438.01837: variable '__network_packages_default_initscripts' from source: role '' defaults 24971 1727096438.01885: variable '__network_packages_default_initscripts' from source: role '' defaults 24971 1727096438.02018: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 24971 1727096438.02312: variable 'network_connections' from source: task vars 24971 1727096438.02316: variable 'interface' from source: play vars 24971 1727096438.02357: variable 'interface' from source: play vars 24971 1727096438.02363: variable 'ansible_distribution' from source: facts 24971 1727096438.02366: variable '__network_rh_distros' from source: role '' defaults 24971 1727096438.02375: variable 'ansible_distribution_major_version' from source: facts 24971 1727096438.02388: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 24971 1727096438.02493: variable 'ansible_distribution' from source: facts 24971 
1727096438.02496: variable '__network_rh_distros' from source: role '' defaults 24971 1727096438.02501: variable 'ansible_distribution_major_version' from source: facts 24971 1727096438.02518: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 24971 1727096438.02682: variable 'ansible_distribution' from source: facts 24971 1727096438.02685: variable '__network_rh_distros' from source: role '' defaults 24971 1727096438.02688: variable 'ansible_distribution_major_version' from source: facts 24971 1727096438.02713: variable 'network_provider' from source: set_fact 24971 1727096438.02729: variable 'ansible_facts' from source: unknown 24971 1727096438.03355: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 24971 1727096438.03358: when evaluation is False, skipping this task 24971 1727096438.03361: _execute() done 24971 1727096438.03363: dumping result to json 24971 1727096438.03365: done dumping result, returning 24971 1727096438.03376: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages [0afff68d-5257-3482-6844-000000000076] 24971 1727096438.03379: sending task result for task 0afff68d-5257-3482-6844-000000000076 24971 1727096438.03477: done sending task result for task 0afff68d-5257-3482-6844-000000000076 24971 1727096438.03480: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 24971 1727096438.03543: no more pending results, returning what we have 24971 1727096438.03546: results queue empty 24971 1727096438.03547: checking for any_errors_fatal 24971 1727096438.03554: done checking for any_errors_fatal 24971 1727096438.03555: checking for max_fail_percentage 24971 1727096438.03556: done checking for max_fail_percentage 24971 1727096438.03558: checking to see if all hosts have failed and the running result is not ok 24971 1727096438.03558: done checking to see if all hosts have failed 24971 1727096438.03559: getting the remaining hosts for this loop 24971 1727096438.03561: done getting the remaining hosts for this loop 24971 1727096438.03565: getting the next task for host managed_node3 24971 1727096438.03625: done getting next task for host managed_node3 24971 1727096438.03629: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 24971 1727096438.03632: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 24971 1727096438.03648: getting variables 24971 1727096438.03650: in VariableManager get_vars() 24971 1727096438.03730: Calling all_inventory to load vars for managed_node3 24971 1727096438.03733: Calling groups_inventory to load vars for managed_node3 24971 1727096438.03735: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096438.03743: Calling all_plugins_play to load vars for managed_node3 24971 1727096438.03745: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096438.03748: Calling groups_plugins_play to load vars for managed_node3 24971 1727096438.04685: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096438.05552: done with get_vars() 24971 1727096438.05572: done getting variables 24971 1727096438.05614: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Monday 23 September 2024 09:00:38 -0400 (0:00:00.105) 0:00:25.534 ****** 24971 1727096438.05638: entering _queue_task() for managed_node3/package 24971 1727096438.05891: worker is 1 (out of 1 available) 24971 1727096438.05905: exiting _queue_task() for managed_node3/package 24971 1727096438.05919: done queuing things up, now waiting for results queue to drain 24971 1727096438.05920: waiting for pending results... 
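The "Install packages" skip above means every entry of network_packages was already present in the gathered package facts, so the subset test on the when-clause came back False. A task reproducing the logged condition could look like this sketch (the package state is an assumption; the log shows only the condition and the package action plugin):

    - name: Install packages
      ansible.builtin.package:
        name: "{{ network_packages }}"
        state: present  # assumed; not visible in the log
      when: not network_packages is subset(ansible_facts.packages.keys())
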
24971 1727096438.06162: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 24971 1727096438.06229: in run() - task 0afff68d-5257-3482-6844-000000000077 24971 1727096438.06241: variable 'ansible_search_path' from source: unknown 24971 1727096438.06245: variable 'ansible_search_path' from source: unknown 24971 1727096438.06279: calling self._execute() 24971 1727096438.06351: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096438.06358: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096438.06366: variable 'omit' from source: magic vars 24971 1727096438.06646: variable 'ansible_distribution_major_version' from source: facts 24971 1727096438.06655: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096438.06748: variable 'network_state' from source: role '' defaults 24971 1727096438.06751: Evaluated conditional (network_state != {}): False 24971 1727096438.06754: when evaluation is False, skipping this task 24971 1727096438.06757: _execute() done 24971 1727096438.06759: dumping result to json 24971 1727096438.06761: done dumping result, returning 24971 1727096438.06772: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0afff68d-5257-3482-6844-000000000077] 24971 1727096438.06779: sending task result for task 0afff68d-5257-3482-6844-000000000077 24971 1727096438.06865: done sending task result for task 0afff68d-5257-3482-6844-000000000077 24971 1727096438.06869: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 24971 1727096438.06950: no more pending results, returning what we have 24971 1727096438.06953: results queue empty 24971 1727096438.06954: checking for any_errors_fatal 24971 1727096438.06959: done checking for any_errors_fatal 24971 1727096438.06959: checking for max_fail_percentage 24971 1727096438.06961: done checking for max_fail_percentage 24971 1727096438.06962: checking to see if all hosts have failed and the running result is not ok 24971 1727096438.06963: done checking to see if all hosts have failed 24971 1727096438.06963: getting the remaining hosts for this loop 24971 1727096438.06965: done getting the remaining hosts for this loop 24971 1727096438.06970: getting the next task for host managed_node3 24971 1727096438.06976: done getting next task for host managed_node3 24971 1727096438.06985: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 24971 1727096438.06988: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 24971 1727096438.07004: getting variables 24971 1727096438.07005: in VariableManager get_vars() 24971 1727096438.07038: Calling all_inventory to load vars for managed_node3 24971 1727096438.07041: Calling groups_inventory to load vars for managed_node3 24971 1727096438.07043: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096438.07052: Calling all_plugins_play to load vars for managed_node3 24971 1727096438.07054: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096438.07057: Calling groups_plugins_play to load vars for managed_node3 24971 1727096438.07814: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096438.08691: done with get_vars() 24971 1727096438.08710: done getting variables 24971 1727096438.08752: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Monday 23 September 2024 09:00:38 -0400 (0:00:00.031) 0:00:25.565 ****** 24971 1727096438.08780: entering _queue_task() for managed_node3/package 24971 1727096438.09020: worker is 1 (out of 1 available) 24971 1727096438.09035: exiting _queue_task() for managed_node3/package 24971 1727096438.09049: done queuing things up, now waiting for results queue to drain 24971 1727096438.09050: waiting for pending results... 
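Both nmstate-related install tasks are gated on a non-empty network_state, which comes from the role defaults as an empty dict in this run, so they skip. A minimal sketch of such a task, with an assumed package list, could be:

    - name: Install NetworkManager and nmstate when using network_state variable
      ansible.builtin.package:
        name:
          - NetworkManager  # assumed package names; not shown in the log
          - nmstate
        state: present
      when: network_state != {}
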
24971 1727096438.09229: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 24971 1727096438.09319: in run() - task 0afff68d-5257-3482-6844-000000000078 24971 1727096438.09331: variable 'ansible_search_path' from source: unknown 24971 1727096438.09334: variable 'ansible_search_path' from source: unknown 24971 1727096438.09362: calling self._execute() 24971 1727096438.09438: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096438.09442: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096438.09451: variable 'omit' from source: magic vars 24971 1727096438.09717: variable 'ansible_distribution_major_version' from source: facts 24971 1727096438.09728: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096438.09807: variable 'network_state' from source: role '' defaults 24971 1727096438.09817: Evaluated conditional (network_state != {}): False 24971 1727096438.09821: when evaluation is False, skipping this task 24971 1727096438.09824: _execute() done 24971 1727096438.09828: dumping result to json 24971 1727096438.09830: done dumping result, returning 24971 1727096438.09833: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0afff68d-5257-3482-6844-000000000078] 24971 1727096438.09845: sending task result for task 0afff68d-5257-3482-6844-000000000078 24971 1727096438.09928: done sending task result for task 0afff68d-5257-3482-6844-000000000078 24971 1727096438.09930: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 24971 1727096438.09998: no more pending results, returning what we have 24971 1727096438.10001: results queue empty 24971 1727096438.10002: checking for any_errors_fatal 24971 1727096438.10008: done checking for any_errors_fatal 24971 1727096438.10009: checking for max_fail_percentage 24971 1727096438.10010: done checking for max_fail_percentage 24971 1727096438.10011: checking to see if all hosts have failed and the running result is not ok 24971 1727096438.10012: done checking to see if all hosts have failed 24971 1727096438.10012: getting the remaining hosts for this loop 24971 1727096438.10014: done getting the remaining hosts for this loop 24971 1727096438.10017: getting the next task for host managed_node3 24971 1727096438.10023: done getting next task for host managed_node3 24971 1727096438.10026: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 24971 1727096438.10029: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 24971 1727096438.10046: getting variables 24971 1727096438.10047: in VariableManager get_vars() 24971 1727096438.10094: Calling all_inventory to load vars for managed_node3 24971 1727096438.10096: Calling groups_inventory to load vars for managed_node3 24971 1727096438.10099: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096438.10107: Calling all_plugins_play to load vars for managed_node3 24971 1727096438.10109: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096438.10111: Calling groups_plugins_play to load vars for managed_node3 24971 1727096438.10976: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096438.11837: done with get_vars() 24971 1727096438.11854: done getting variables 24971 1727096438.11899: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Monday 23 September 2024 09:00:38 -0400 (0:00:00.031) 0:00:25.597 ****** 24971 1727096438.11925: entering _queue_task() for managed_node3/service 24971 1727096438.12159: worker is 1 (out of 1 available) 24971 1727096438.12174: exiting _queue_task() for managed_node3/service 24971 1727096438.12188: done queuing things up, now waiting for results queue to drain 24971 1727096438.12189: waiting for pending results... 
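The task queued here restarts NetworkManager only when the role has detected wireless or team connections. The flags __network_wireless_connections_defined and __network_team_connections_defined are derived from the network_connections list, and since the profile under test defines neither connection type, the conditional evaluates to False and the restart is skipped, as the following output shows. A rough sketch of such a conditional restart task, assuming the plain service module implied by the action-plugin load above, might be:

  # Sketch only; the role's real task in tasks/main.yml may differ in detail.
  - name: Restart NetworkManager due to wireless or team interfaces
    ansible.builtin.service:
      name: NetworkManager
      state: restarted
    when: __network_wireless_connections_defined or __network_team_connections_defined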
24971 1727096438.12371: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 24971 1727096438.12458: in run() - task 0afff68d-5257-3482-6844-000000000079 24971 1727096438.12471: variable 'ansible_search_path' from source: unknown 24971 1727096438.12476: variable 'ansible_search_path' from source: unknown 24971 1727096438.12505: calling self._execute() 24971 1727096438.12581: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096438.12585: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096438.12593: variable 'omit' from source: magic vars 24971 1727096438.12869: variable 'ansible_distribution_major_version' from source: facts 24971 1727096438.12881: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096438.12961: variable '__network_wireless_connections_defined' from source: role '' defaults 24971 1727096438.13099: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24971 1727096438.14665: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24971 1727096438.14720: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24971 1727096438.14749: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24971 1727096438.14780: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24971 1727096438.14800: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24971 1727096438.14864: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24971 1727096438.14889: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24971 1727096438.14906: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24971 1727096438.14931: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24971 1727096438.14942: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24971 1727096438.14981: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24971 1727096438.14998: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24971 1727096438.15014: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 24971 1727096438.15038: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24971 1727096438.15049: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24971 1727096438.15085: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24971 1727096438.15103: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24971 1727096438.15119: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24971 1727096438.15142: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24971 1727096438.15152: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24971 1727096438.15347: variable 'network_connections' from source: task vars 24971 1727096438.15355: variable 'interface' from source: play vars 24971 1727096438.15424: variable 'interface' from source: play vars 24971 1727096438.15479: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24971 1727096438.15683: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24971 1727096438.15741: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24971 1727096438.15794: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24971 1727096438.15820: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24971 1727096438.15858: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 24971 1727096438.15925: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 24971 1727096438.15930: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 24971 1727096438.15949: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 24971 1727096438.16029: variable '__network_team_connections_defined' from source: role '' defaults 24971 1727096438.16218: variable 'network_connections' from source: task vars 24971 1727096438.16221: variable 'interface' from source: 
play vars 24971 1727096438.16271: variable 'interface' from source: play vars 24971 1727096438.16293: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 24971 1727096438.16296: when evaluation is False, skipping this task 24971 1727096438.16299: _execute() done 24971 1727096438.16301: dumping result to json 24971 1727096438.16303: done dumping result, returning 24971 1727096438.16313: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0afff68d-5257-3482-6844-000000000079] 24971 1727096438.16316: sending task result for task 0afff68d-5257-3482-6844-000000000079 24971 1727096438.16404: done sending task result for task 0afff68d-5257-3482-6844-000000000079 24971 1727096438.16412: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 24971 1727096438.16457: no more pending results, returning what we have 24971 1727096438.16461: results queue empty 24971 1727096438.16462: checking for any_errors_fatal 24971 1727096438.16470: done checking for any_errors_fatal 24971 1727096438.16471: checking for max_fail_percentage 24971 1727096438.16473: done checking for max_fail_percentage 24971 1727096438.16475: checking to see if all hosts have failed and the running result is not ok 24971 1727096438.16476: done checking to see if all hosts have failed 24971 1727096438.16476: getting the remaining hosts for this loop 24971 1727096438.16478: done getting the remaining hosts for this loop 24971 1727096438.16481: getting the next task for host managed_node3 24971 1727096438.16488: done getting next task for host managed_node3 24971 1727096438.16492: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 24971 1727096438.16495: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 24971 1727096438.16513: getting variables 24971 1727096438.16515: in VariableManager get_vars() 24971 1727096438.16555: Calling all_inventory to load vars for managed_node3 24971 1727096438.16557: Calling groups_inventory to load vars for managed_node3 24971 1727096438.16559: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096438.16578: Calling all_plugins_play to load vars for managed_node3 24971 1727096438.16582: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096438.16585: Calling groups_plugins_play to load vars for managed_node3 24971 1727096438.17886: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096438.19472: done with get_vars() 24971 1727096438.19496: done getting variables 24971 1727096438.19549: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Monday 23 September 2024 09:00:38 -0400 (0:00:00.076) 0:00:25.673 ****** 24971 1727096438.19602: entering _queue_task() for managed_node3/service 24971 1727096438.20012: worker is 1 (out of 1 available) 24971 1727096438.20024: exiting _queue_task() for managed_node3/service 24971 1727096438.20037: done queuing things up, now waiting for results queue to drain 24971 1727096438.20038: waiting for pending results... 24971 1727096438.20302: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 24971 1727096438.20395: in run() - task 0afff68d-5257-3482-6844-00000000007a 24971 1727096438.20402: variable 'ansible_search_path' from source: unknown 24971 1727096438.20406: variable 'ansible_search_path' from source: unknown 24971 1727096438.20434: calling self._execute() 24971 1727096438.20512: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096438.20515: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096438.20527: variable 'omit' from source: magic vars 24971 1727096438.20976: variable 'ansible_distribution_major_version' from source: facts 24971 1727096438.20979: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096438.21137: variable 'network_provider' from source: set_fact 24971 1727096438.21147: variable 'network_state' from source: role '' defaults 24971 1727096438.21162: Evaluated conditional (network_provider == "nm" or network_state != {}): True 24971 1727096438.21178: variable 'omit' from source: magic vars 24971 1727096438.21249: variable 'omit' from source: magic vars 24971 1727096438.21286: variable 'network_service_name' from source: role '' defaults 24971 1727096438.21361: variable 'network_service_name' from source: role '' defaults 24971 1727096438.21543: variable '__network_provider_setup' from source: role '' defaults 24971 1727096438.21546: variable '__network_service_name_default_nm' from source: role '' defaults 24971 1727096438.21558: variable '__network_service_name_default_nm' from source: role '' defaults 24971 1727096438.21574: variable '__network_packages_default_nm' from source: role '' defaults 
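From this point the run stops skipping: network_provider is "nm", so the conditional (network_provider == "nm" or network_state != {}) is True, and the role resolves its provider-specific defaults (service name, default package lists, wpa_supplicant and initscripts fallbacks) before handing the work to the service action plugin, which executes the systemd module over SSH. The module invocation captured near the end of this task's output (module_args: name=NetworkManager, state=started, enabled=true) corresponds to a task along these lines; the exact YAML in the role is assumed, not quoted:

  # Sketch only; the indirection through network_service_name is an assumption
  # based on the variable resolution logged above, not the role's literal task.
  - name: Enable and start NetworkManager
    ansible.builtin.service:
      name: "{{ network_service_name }}"
      state: started
      enabled: true
    when: network_provider == "nm" or network_state != {}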
24971 1727096438.21638: variable '__network_packages_default_nm' from source: role '' defaults 24971 1727096438.21929: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24971 1727096438.24556: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24971 1727096438.24610: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24971 1727096438.24637: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24971 1727096438.24666: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24971 1727096438.24691: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24971 1727096438.24751: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24971 1727096438.24778: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24971 1727096438.24797: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24971 1727096438.24822: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24971 1727096438.24834: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24971 1727096438.24866: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24971 1727096438.24889: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24971 1727096438.24906: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24971 1727096438.24932: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24971 1727096438.24942: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24971 1727096438.25099: variable '__network_packages_default_gobject_packages' from source: role '' defaults 24971 1727096438.25177: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24971 1727096438.25196: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24971 1727096438.25216: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24971 1727096438.25240: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24971 1727096438.25251: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24971 1727096438.25318: variable 'ansible_python' from source: facts 24971 1727096438.25333: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 24971 1727096438.25392: variable '__network_wpa_supplicant_required' from source: role '' defaults 24971 1727096438.25448: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 24971 1727096438.25535: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24971 1727096438.25551: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24971 1727096438.25570: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24971 1727096438.25596: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24971 1727096438.25606: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24971 1727096438.25642: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24971 1727096438.25661: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24971 1727096438.25682: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24971 1727096438.25706: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24971 1727096438.25716: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24971 1727096438.25809: variable 'network_connections' from 
source: task vars 24971 1727096438.25815: variable 'interface' from source: play vars 24971 1727096438.25875: variable 'interface' from source: play vars 24971 1727096438.26200: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24971 1727096438.26681: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24971 1727096438.26684: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24971 1727096438.26686: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24971 1727096438.26688: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24971 1727096438.26690: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 24971 1727096438.26876: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 24971 1727096438.26928: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 24971 1727096438.27037: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 24971 1727096438.27223: variable '__network_wireless_connections_defined' from source: role '' defaults 24971 1727096438.27704: variable 'network_connections' from source: task vars 24971 1727096438.27714: variable 'interface' from source: play vars 24971 1727096438.27901: variable 'interface' from source: play vars 24971 1727096438.27938: variable '__network_packages_default_wireless' from source: role '' defaults 24971 1727096438.28058: variable '__network_wireless_connections_defined' from source: role '' defaults 24971 1727096438.28738: variable 'network_connections' from source: task vars 24971 1727096438.28923: variable 'interface' from source: play vars 24971 1727096438.29109: variable 'interface' from source: play vars 24971 1727096438.29141: variable '__network_packages_default_team' from source: role '' defaults 24971 1727096438.29383: variable '__network_team_connections_defined' from source: role '' defaults 24971 1727096438.29912: variable 'network_connections' from source: task vars 24971 1727096438.29994: variable 'interface' from source: play vars 24971 1727096438.30062: variable 'interface' from source: play vars 24971 1727096438.30233: variable '__network_service_name_default_initscripts' from source: role '' defaults 24971 1727096438.30370: variable '__network_service_name_default_initscripts' from source: role '' defaults 24971 1727096438.30745: variable '__network_packages_default_initscripts' from source: role '' defaults 24971 1727096438.30748: variable '__network_packages_default_initscripts' from source: role '' defaults 24971 1727096438.31143: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 24971 1727096438.32360: variable 'network_connections' from source: task vars 24971 1727096438.32364: variable 'interface' from source: play vars 24971 1727096438.32431: variable 'interface' from source: play vars 24971 
1727096438.32438: variable 'ansible_distribution' from source: facts 24971 1727096438.32441: variable '__network_rh_distros' from source: role '' defaults 24971 1727096438.32448: variable 'ansible_distribution_major_version' from source: facts 24971 1727096438.32462: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 24971 1727096438.32982: variable 'ansible_distribution' from source: facts 24971 1727096438.32985: variable '__network_rh_distros' from source: role '' defaults 24971 1727096438.32991: variable 'ansible_distribution_major_version' from source: facts 24971 1727096438.33006: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 24971 1727096438.33508: variable 'ansible_distribution' from source: facts 24971 1727096438.33511: variable '__network_rh_distros' from source: role '' defaults 24971 1727096438.33517: variable 'ansible_distribution_major_version' from source: facts 24971 1727096438.33553: variable 'network_provider' from source: set_fact 24971 1727096438.33581: variable 'omit' from source: magic vars 24971 1727096438.33610: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24971 1727096438.33637: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24971 1727096438.33654: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24971 1727096438.33671: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096438.34134: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096438.34137: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24971 1727096438.34139: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096438.34141: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096438.34340: Set connection var ansible_shell_type to sh 24971 1727096438.34343: Set connection var ansible_shell_executable to /bin/sh 24971 1727096438.34345: Set connection var ansible_timeout to 10 24971 1727096438.34347: Set connection var ansible_connection to ssh 24971 1727096438.34349: Set connection var ansible_pipelining to False 24971 1727096438.34351: Set connection var ansible_module_compression to ZIP_DEFLATED 24971 1727096438.34669: variable 'ansible_shell_executable' from source: unknown 24971 1727096438.34673: variable 'ansible_connection' from source: unknown 24971 1727096438.34675: variable 'ansible_module_compression' from source: unknown 24971 1727096438.34678: variable 'ansible_shell_type' from source: unknown 24971 1727096438.34680: variable 'ansible_shell_executable' from source: unknown 24971 1727096438.34682: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096438.34684: variable 'ansible_pipelining' from source: unknown 24971 1727096438.34686: variable 'ansible_timeout' from source: unknown 24971 1727096438.34688: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096438.34702: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=False) 24971 1727096438.34712: variable 'omit' from source: magic vars 24971 1727096438.34718: starting attempt loop 24971 1727096438.34721: running the handler 24971 1727096438.35197: variable 'ansible_facts' from source: unknown 24971 1727096438.36039: _low_level_execute_command(): starting 24971 1727096438.36050: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24971 1727096438.36769: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24971 1727096438.36789: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096438.36803: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096438.36819: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24971 1727096438.36834: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 24971 1727096438.36853: stderr chunk (state=3): >>>debug2: match not found <<< 24971 1727096438.36895: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096438.36966: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096438.37001: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096438.37016: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096438.37093: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096438.38783: stdout chunk (state=3): >>>/root <<< 24971 1727096438.38976: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096438.38979: stdout chunk (state=3): >>><<< 24971 1727096438.38981: stderr chunk (state=3): >>><<< 24971 1727096438.38985: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096438.38987: _low_level_execute_command(): starting 24971 1727096438.39061: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096438.3894854-26074-152119167679250 `" && echo ansible-tmp-1727096438.3894854-26074-152119167679250="` echo /root/.ansible/tmp/ansible-tmp-1727096438.3894854-26074-152119167679250 `" ) && sleep 0' 24971 1727096438.40223: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24971 1727096438.40237: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096438.40255: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096438.40295: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096438.40396: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096438.40408: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096438.40420: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096438.40500: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096438.42445: stdout chunk (state=3): >>>ansible-tmp-1727096438.3894854-26074-152119167679250=/root/.ansible/tmp/ansible-tmp-1727096438.3894854-26074-152119167679250 <<< 24971 1727096438.42619: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096438.42681: stderr chunk (state=3): >>><<< 24971 1727096438.42725: stdout chunk (state=3): >>><<< 24971 1727096438.42793: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096438.3894854-26074-152119167679250=/root/.ansible/tmp/ansible-tmp-1727096438.3894854-26074-152119167679250 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 
10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096438.42839: variable 'ansible_module_compression' from source: unknown 24971 1727096438.43042: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24971ctsu8jan/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 24971 1727096438.43162: variable 'ansible_facts' from source: unknown 24971 1727096438.43593: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096438.3894854-26074-152119167679250/AnsiballZ_systemd.py 24971 1727096438.44036: Sending initial data 24971 1727096438.44045: Sent initial data (156 bytes) 24971 1727096438.45121: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096438.45196: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096438.45342: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096438.45382: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096438.46995: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24971 1727096438.47026: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 24971 1727096438.47076: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24971ctsu8jan/tmp13ujkcyy /root/.ansible/tmp/ansible-tmp-1727096438.3894854-26074-152119167679250/AnsiballZ_systemd.py <<< 24971 1727096438.47079: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096438.3894854-26074-152119167679250/AnsiballZ_systemd.py" <<< 24971 1727096438.47116: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24971ctsu8jan/tmp13ujkcyy" to remote "/root/.ansible/tmp/ansible-tmp-1727096438.3894854-26074-152119167679250/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096438.3894854-26074-152119167679250/AnsiballZ_systemd.py" <<< 24971 1727096438.49598: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096438.49602: stderr chunk (state=3): >>><<< 24971 1727096438.49605: stdout chunk (state=3): >>><<< 24971 1727096438.49607: done transferring module to remote 24971 1727096438.49609: _low_level_execute_command(): starting 24971 1727096438.49611: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096438.3894854-26074-152119167679250/ /root/.ansible/tmp/ansible-tmp-1727096438.3894854-26074-152119167679250/AnsiballZ_systemd.py && sleep 0' 24971 1727096438.50687: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration <<< 24971 1727096438.50759: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096438.50849: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096438.50915: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096438.50961: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096438.52876: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096438.52879: stdout chunk (state=3): >>><<< 24971 1727096438.52882: stderr chunk (state=3): >>><<< 24971 1727096438.53176: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096438.53181: _low_level_execute_command(): starting 24971 1727096438.53185: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096438.3894854-26074-152119167679250/AnsiballZ_systemd.py && sleep 0' 24971 1727096438.54284: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24971 1727096438.54366: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096438.54396: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096438.54430: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096438.54493: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096438.54603: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096438.83778: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "705", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", 
"ExecMainStartTimestamp": "Mon 2024-09-23 08:51:17 EDT", "ExecMainStartTimestampMonotonic": "22098154", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Mon 2024-09-23 08:51:17 EDT", "ExecMainHandoffTimestampMonotonic": "22114950", "ExecMainPID": "705", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10575872", "MemoryPeak": "14127104", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3305709568", "EffectiveMemoryMax": "3702882304", "EffectiveMemoryHigh": "3702882304", "CPUUsageNSec": "1734022000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpR<<< 24971 1727096438.83804: stdout chunk (state=3): >>>eceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", 
"LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": 
"shutdown.target cloud-init.service NetworkManager-wait-online.service network.target multi-user.target", "After": "system<<< 24971 1727096438.83822: stdout chunk (state=3): >>>d-journald.socket basic.target system.slice sysinit.target dbus.socket dbus-broker.service cloud-init-local.service network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Mon 2024-09-23 08:54:46 EDT", "StateChangeTimestampMonotonic": "229668491", "InactiveExitTimestamp": "Mon 2024-09-23 08:51:17 EDT", "InactiveExitTimestampMonotonic": "22098658", "ActiveEnterTimestamp": "Mon 2024-09-23 08:51:18 EDT", "ActiveEnterTimestampMonotonic": "22578647", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Mon 2024-09-23 08:51:17 EDT", "ConditionTimestampMonotonic": "22097143", "AssertTimestamp": "Mon 2024-09-23 08:51:17 EDT", "AssertTimestampMonotonic": "22097145", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "795485e52798420593bcdae791c1fbbf", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 24971 1727096438.85751: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
<<< 24971 1727096438.85823: stderr chunk (state=3): >>><<< 24971 1727096438.85826: stdout chunk (state=3): >>><<< 24971 1727096438.85837: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "705", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Mon 2024-09-23 08:51:17 EDT", "ExecMainStartTimestampMonotonic": "22098154", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Mon 2024-09-23 08:51:17 EDT", "ExecMainHandoffTimestampMonotonic": "22114950", "ExecMainPID": "705", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10575872", "MemoryPeak": "14127104", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3305709568", "EffectiveMemoryMax": "3702882304", "EffectiveMemoryHigh": "3702882304", "CPUUsageNSec": "1734022000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", 
"DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target cloud-init.service NetworkManager-wait-online.service network.target multi-user.target", "After": "systemd-journald.socket basic.target system.slice sysinit.target dbus.socket dbus-broker.service cloud-init-local.service network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Mon 2024-09-23 08:54:46 EDT", "StateChangeTimestampMonotonic": "229668491", "InactiveExitTimestamp": "Mon 2024-09-23 08:51:17 EDT", "InactiveExitTimestampMonotonic": "22098658", "ActiveEnterTimestamp": "Mon 2024-09-23 08:51:18 EDT", "ActiveEnterTimestampMonotonic": "22578647", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Mon 2024-09-23 08:51:17 EDT", "ConditionTimestampMonotonic": "22097143", "AssertTimestamp": "Mon 2024-09-23 08:51:17 EDT", "AssertTimestampMonotonic": "22097145", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "795485e52798420593bcdae791c1fbbf", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 24971 1727096438.85958: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096438.3894854-26074-152119167679250/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24971 1727096438.85979: _low_level_execute_command(): starting 24971 1727096438.85984: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096438.3894854-26074-152119167679250/ > /dev/null 2>&1 && sleep 0' 24971 1727096438.86477: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096438.86480: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24971 1727096438.86483: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096438.86485: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096438.86487: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 24971 1727096438.86489: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096438.86542: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 
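The JSON dump above is the full unit-property result returned by the systemd module for the "Enable and start NetworkManager" task, followed by removal of the remote temporary directory over the same SSH connection. The module_args echoed in the invocation block (name=NetworkManager, state=started, enabled=true, scope=system, plus '_ansible_no_log': True) correspond roughly to the standalone task sketched below. This sketch is illustrative only; the role's real task lives in the collection's roles/network/tasks/main.yml and is not reproduced in this log.

    # Illustrative sketch only -- approximates the module_args echoed above,
    # not the actual task from fedora.linux_system_roles.network.
    - name: Enable and start NetworkManager
      ansible.builtin.systemd:
        name: NetworkManager
        state: started
        enabled: true
        scope: system
      no_log: true   # mirrors the no_log flag that censors this task's result below
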
24971 1727096438.86550: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096438.86555: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096438.86587: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096438.88428: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096438.88460: stderr chunk (state=3): >>><<< 24971 1727096438.88464: stdout chunk (state=3): >>><<< 24971 1727096438.88520: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096438.88524: handler run complete 24971 1727096438.88575: attempt loop complete, returning result 24971 1727096438.88578: _execute() done 24971 1727096438.88581: dumping result to json 24971 1727096438.88615: done dumping result, returning 24971 1727096438.88618: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0afff68d-5257-3482-6844-00000000007a] 24971 1727096438.88623: sending task result for task 0afff68d-5257-3482-6844-00000000007a 24971 1727096438.88892: done sending task result for task 0afff68d-5257-3482-6844-00000000007a 24971 1727096438.88895: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 24971 1727096438.88951: no more pending results, returning what we have 24971 1727096438.88954: results queue empty 24971 1727096438.88955: checking for any_errors_fatal 24971 1727096438.88959: done checking for any_errors_fatal 24971 1727096438.88960: checking for max_fail_percentage 24971 1727096438.88961: done checking for max_fail_percentage 24971 1727096438.88962: checking to see if all hosts have failed and the running result is not ok 24971 1727096438.88963: done checking to see if all hosts have failed 24971 1727096438.88963: getting the remaining hosts for this loop 24971 1727096438.88965: done getting the remaining hosts for this loop 24971 1727096438.88973: getting the next task for host managed_node3 24971 1727096438.88982: done getting next task for host managed_node3 24971 1727096438.88985: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 24971 1727096438.88988: ^ state is: HOST STATE: block=3, task=15, 
rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 24971 1727096438.88999: getting variables 24971 1727096438.89001: in VariableManager get_vars() 24971 1727096438.89036: Calling all_inventory to load vars for managed_node3 24971 1727096438.89038: Calling groups_inventory to load vars for managed_node3 24971 1727096438.89040: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096438.89052: Calling all_plugins_play to load vars for managed_node3 24971 1727096438.89057: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096438.89060: Calling groups_plugins_play to load vars for managed_node3 24971 1727096438.90589: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096438.92438: done with get_vars() 24971 1727096438.92479: done getting variables 24971 1727096438.92551: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Monday 23 September 2024 09:00:38 -0400 (0:00:00.729) 0:00:26.403 ****** 24971 1727096438.92603: entering _queue_task() for managed_node3/service 24971 1727096438.93197: worker is 1 (out of 1 available) 24971 1727096438.93208: exiting _queue_task() for managed_node3/service 24971 1727096438.93225: done queuing things up, now waiting for results queue to drain 24971 1727096438.93227: waiting for pending results... 
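Every remote command in this run reuses a persistent SSH connection: the stderr chunks repeatedly show "auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0'" and "mux_client_request_session", i.e. OpenSSH ControlMaster multiplexing over an existing control socket rather than a fresh handshake per command. The control socket under ~/.ansible/cp/ comes from the ssh connection plugin's defaults; the snippet below is only an illustration of how equivalent options could be set explicitly in inventory or group_vars, and the ControlPersist value is an assumed example, not read from this run's configuration.

    # Illustrative only -- explicit per-host SSH options equivalent to the
    # multiplexing behaviour visible in the debug chunks above.
    # The 60s ControlPersist value is an assumption, not taken from this run.
    ansible_ssh_common_args: "-o ControlMaster=auto -o ControlPersist=60s"
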
24971 1727096438.93686: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 24971 1727096438.93694: in run() - task 0afff68d-5257-3482-6844-00000000007b 24971 1727096438.93699: variable 'ansible_search_path' from source: unknown 24971 1727096438.93702: variable 'ansible_search_path' from source: unknown 24971 1727096438.93705: calling self._execute() 24971 1727096438.93743: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096438.93763: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096438.93789: variable 'omit' from source: magic vars 24971 1727096438.94254: variable 'ansible_distribution_major_version' from source: facts 24971 1727096438.94281: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096438.94432: variable 'network_provider' from source: set_fact 24971 1727096438.94436: Evaluated conditional (network_provider == "nm"): True 24971 1727096438.94568: variable '__network_wpa_supplicant_required' from source: role '' defaults 24971 1727096438.94656: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 24971 1727096438.94977: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24971 1727096438.98007: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24971 1727096438.98053: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24971 1727096438.98082: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24971 1727096438.98107: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24971 1727096438.98161: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24971 1727096438.98220: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24971 1727096438.98245: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24971 1727096438.98265: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24971 1727096438.98314: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24971 1727096438.98332: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24971 1727096438.98362: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24971 1727096438.98385: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 24971 1727096438.98401: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24971 1727096438.98428: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24971 1727096438.98439: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24971 1727096438.98466: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24971 1727096438.98486: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24971 1727096438.98516: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24971 1727096438.98553: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24971 1727096438.98566: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24971 1727096438.98672: variable 'network_connections' from source: task vars 24971 1727096438.98680: variable 'interface' from source: play vars 24971 1727096438.98740: variable 'interface' from source: play vars 24971 1727096438.98794: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24971 1727096438.98931: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24971 1727096438.98972: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24971 1727096438.98998: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24971 1727096438.99021: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24971 1727096438.99074: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 24971 1727096438.99098: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 24971 1727096438.99115: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 24971 1727096438.99141: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 24971 
1727096438.99196: variable '__network_wireless_connections_defined' from source: role '' defaults 24971 1727096438.99420: variable 'network_connections' from source: task vars 24971 1727096438.99424: variable 'interface' from source: play vars 24971 1727096438.99465: variable 'interface' from source: play vars 24971 1727096438.99490: Evaluated conditional (__network_wpa_supplicant_required): False 24971 1727096438.99493: when evaluation is False, skipping this task 24971 1727096438.99496: _execute() done 24971 1727096438.99498: dumping result to json 24971 1727096438.99501: done dumping result, returning 24971 1727096438.99511: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0afff68d-5257-3482-6844-00000000007b] 24971 1727096438.99521: sending task result for task 0afff68d-5257-3482-6844-00000000007b 24971 1727096438.99605: done sending task result for task 0afff68d-5257-3482-6844-00000000007b 24971 1727096438.99607: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 24971 1727096438.99656: no more pending results, returning what we have 24971 1727096438.99663: results queue empty 24971 1727096438.99664: checking for any_errors_fatal 24971 1727096438.99683: done checking for any_errors_fatal 24971 1727096438.99684: checking for max_fail_percentage 24971 1727096438.99686: done checking for max_fail_percentage 24971 1727096438.99687: checking to see if all hosts have failed and the running result is not ok 24971 1727096438.99687: done checking to see if all hosts have failed 24971 1727096438.99688: getting the remaining hosts for this loop 24971 1727096438.99689: done getting the remaining hosts for this loop 24971 1727096438.99693: getting the next task for host managed_node3 24971 1727096438.99700: done getting next task for host managed_node3 24971 1727096438.99703: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 24971 1727096438.99706: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 24971 1727096438.99726: getting variables 24971 1727096438.99727: in VariableManager get_vars() 24971 1727096438.99766: Calling all_inventory to load vars for managed_node3 24971 1727096438.99772: Calling groups_inventory to load vars for managed_node3 24971 1727096438.99774: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096438.99783: Calling all_plugins_play to load vars for managed_node3 24971 1727096438.99786: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096438.99788: Calling groups_plugins_play to load vars for managed_node3 24971 1727096439.01026: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096439.02800: done with get_vars() 24971 1727096439.02905: done getting variables 24971 1727096439.03095: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Monday 23 September 2024 09:00:39 -0400 (0:00:00.105) 0:00:26.509 ****** 24971 1727096439.03131: entering _queue_task() for managed_node3/service 24971 1727096439.03708: worker is 1 (out of 1 available) 24971 1727096439.03762: exiting _queue_task() for managed_node3/service 24971 1727096439.03805: done queuing things up, now waiting for results queue to drain 24971 1727096439.03806: waiting for pending results... 24971 1727096439.04329: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service 24971 1727096439.04334: in run() - task 0afff68d-5257-3482-6844-00000000007c 24971 1727096439.04337: variable 'ansible_search_path' from source: unknown 24971 1727096439.04340: variable 'ansible_search_path' from source: unknown 24971 1727096439.04343: calling self._execute() 24971 1727096439.04346: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096439.04348: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096439.04351: variable 'omit' from source: magic vars 24971 1727096439.04777: variable 'ansible_distribution_major_version' from source: facts 24971 1727096439.04788: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096439.04874: variable 'network_provider' from source: set_fact 24971 1727096439.04878: Evaluated conditional (network_provider == "initscripts"): False 24971 1727096439.04881: when evaluation is False, skipping this task 24971 1727096439.04883: _execute() done 24971 1727096439.04886: dumping result to json 24971 1727096439.04888: done dumping result, returning 24971 1727096439.04894: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [0afff68d-5257-3482-6844-00000000007c] 24971 1727096439.04898: sending task result for task 0afff68d-5257-3482-6844-00000000007c skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 24971 1727096439.05083: no more pending results, returning what we have 24971 1727096439.05086: results queue empty 24971 1727096439.05088: checking for 
any_errors_fatal 24971 1727096439.05096: done checking for any_errors_fatal 24971 1727096439.05097: checking for max_fail_percentage 24971 1727096439.05099: done checking for max_fail_percentage 24971 1727096439.05099: checking to see if all hosts have failed and the running result is not ok 24971 1727096439.05100: done checking to see if all hosts have failed 24971 1727096439.05101: getting the remaining hosts for this loop 24971 1727096439.05102: done getting the remaining hosts for this loop 24971 1727096439.05105: getting the next task for host managed_node3 24971 1727096439.05111: done getting next task for host managed_node3 24971 1727096439.05115: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 24971 1727096439.05118: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 24971 1727096439.05137: getting variables 24971 1727096439.05139: in VariableManager get_vars() 24971 1727096439.05182: Calling all_inventory to load vars for managed_node3 24971 1727096439.05184: Calling groups_inventory to load vars for managed_node3 24971 1727096439.05186: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096439.05194: Calling all_plugins_play to load vars for managed_node3 24971 1727096439.05197: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096439.05199: Calling groups_plugins_play to load vars for managed_node3 24971 1727096439.05814: done sending task result for task 0afff68d-5257-3482-6844-00000000007c 24971 1727096439.05818: WORKER PROCESS EXITING 24971 1727096439.06338: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096439.08385: done with get_vars() 24971 1727096439.08409: done getting variables 24971 1727096439.08470: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Monday 23 September 2024 09:00:39 -0400 (0:00:00.053) 0:00:26.562 ****** 24971 1727096439.08506: entering _queue_task() for managed_node3/copy 24971 1727096439.09241: worker is 1 (out of 1 available) 24971 1727096439.09253: exiting _queue_task() for managed_node3/copy 24971 1727096439.09265: done queuing things up, now waiting for results queue to drain 24971 1727096439.09266: waiting for pending results... 
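The "Enable and start wpa_supplicant" and "Enable network service" tasks above are skipped the same way: the role evaluates a conditional (__network_wpa_supplicant_required, then network_provider == "initscripts"), the evaluation comes back False, and the skip result records the failing expression as false_condition. The sketch below shows the general when: pattern that produces this kind of skip; it illustrates the mechanism only and is not the role's actual task, and the service name "network" is an assumed placeholder.

    # Illustrative sketch of the conditional-skip mechanism seen above;
    # not the role's task. "network" is an assumed placeholder service name.
    - name: Enable network service
      ansible.builtin.service:
        name: network
        state: started
        enabled: true
      when: network_provider == "initscripts"   # False in this run, so the task is skipped
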
24971 1727096439.09701: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 24971 1727096439.09915: in run() - task 0afff68d-5257-3482-6844-00000000007d 24971 1727096439.09937: variable 'ansible_search_path' from source: unknown 24971 1727096439.09945: variable 'ansible_search_path' from source: unknown 24971 1727096439.10080: calling self._execute() 24971 1727096439.10183: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096439.10208: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096439.10218: variable 'omit' from source: magic vars 24971 1727096439.10543: variable 'ansible_distribution_major_version' from source: facts 24971 1727096439.10553: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096439.10640: variable 'network_provider' from source: set_fact 24971 1727096439.10643: Evaluated conditional (network_provider == "initscripts"): False 24971 1727096439.10647: when evaluation is False, skipping this task 24971 1727096439.10650: _execute() done 24971 1727096439.10653: dumping result to json 24971 1727096439.10656: done dumping result, returning 24971 1727096439.10665: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0afff68d-5257-3482-6844-00000000007d] 24971 1727096439.10669: sending task result for task 0afff68d-5257-3482-6844-00000000007d 24971 1727096439.10760: done sending task result for task 0afff68d-5257-3482-6844-00000000007d 24971 1727096439.10762: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 24971 1727096439.10835: no more pending results, returning what we have 24971 1727096439.10839: results queue empty 24971 1727096439.10840: checking for any_errors_fatal 24971 1727096439.10851: done checking for any_errors_fatal 24971 1727096439.10852: checking for max_fail_percentage 24971 1727096439.10858: done checking for max_fail_percentage 24971 1727096439.10860: checking to see if all hosts have failed and the running result is not ok 24971 1727096439.10861: done checking to see if all hosts have failed 24971 1727096439.10862: getting the remaining hosts for this loop 24971 1727096439.10863: done getting the remaining hosts for this loop 24971 1727096439.10869: getting the next task for host managed_node3 24971 1727096439.10874: done getting next task for host managed_node3 24971 1727096439.10878: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 24971 1727096439.10881: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 24971 1727096439.10907: getting variables 24971 1727096439.10909: in VariableManager get_vars() 24971 1727096439.10954: Calling all_inventory to load vars for managed_node3 24971 1727096439.10958: Calling groups_inventory to load vars for managed_node3 24971 1727096439.10961: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096439.11006: Calling all_plugins_play to load vars for managed_node3 24971 1727096439.11010: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096439.11013: Calling groups_plugins_play to load vars for managed_node3 24971 1727096439.12000: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096439.13924: done with get_vars() 24971 1727096439.13952: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Monday 23 September 2024 09:00:39 -0400 (0:00:00.057) 0:00:26.620 ****** 24971 1727096439.14245: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 24971 1727096439.15215: worker is 1 (out of 1 available) 24971 1727096439.15228: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 24971 1727096439.15241: done queuing things up, now waiting for results queue to drain 24971 1727096439.15242: waiting for pending results... 24971 1727096439.15938: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 24971 1727096439.16164: in run() - task 0afff68d-5257-3482-6844-00000000007e 24971 1727096439.16196: variable 'ansible_search_path' from source: unknown 24971 1727096439.16255: variable 'ansible_search_path' from source: unknown 24971 1727096439.16431: calling self._execute() 24971 1727096439.16586: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096439.16684: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096439.16688: variable 'omit' from source: magic vars 24971 1727096439.17493: variable 'ansible_distribution_major_version' from source: facts 24971 1727096439.17622: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096439.17625: variable 'omit' from source: magic vars 24971 1727096439.17773: variable 'omit' from source: magic vars 24971 1727096439.18077: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24971 1727096439.24240: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24971 1727096439.24403: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24971 1727096439.24640: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24971 1727096439.24644: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24971 1727096439.24661: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24971 1727096439.24873: variable 'network_provider' from source: set_fact 24971 1727096439.25135: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 24971 1727096439.25231: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24971 1727096439.25325: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24971 1727096439.25512: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24971 1727096439.25515: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24971 1727096439.25586: variable 'omit' from source: magic vars 24971 1727096439.25892: variable 'omit' from source: magic vars 24971 1727096439.26173: variable 'network_connections' from source: task vars 24971 1727096439.26191: variable 'interface' from source: play vars 24971 1727096439.26256: variable 'interface' from source: play vars 24971 1727096439.26627: variable 'omit' from source: magic vars 24971 1727096439.26640: variable '__lsr_ansible_managed' from source: task vars 24971 1727096439.26773: variable '__lsr_ansible_managed' from source: task vars 24971 1727096439.28550: Loaded config def from plugin (lookup/template) 24971 1727096439.28788: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 24971 1727096439.28792: File lookup term: get_ansible_managed.j2 24971 1727096439.28795: variable 'ansible_search_path' from source: unknown 24971 1727096439.28799: evaluation_path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 24971 1727096439.28805: search_path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 24971 1727096439.28808: variable 'ansible_search_path' from source: unknown 24971 1727096439.43011: variable 'ansible_managed' from source: unknown 24971 1727096439.43432: variable 'omit' from source: magic vars 24971 1727096439.43510: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24971 1727096439.43546: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24971 1727096439.43632: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24971 1727096439.43711: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096439.43815: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096439.43835: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24971 1727096439.43974: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096439.43977: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096439.44251: Set connection var ansible_shell_type to sh 24971 1727096439.44254: Set connection var ansible_shell_executable to /bin/sh 24971 1727096439.44256: Set connection var ansible_timeout to 10 24971 1727096439.44258: Set connection var ansible_connection to ssh 24971 1727096439.44260: Set connection var ansible_pipelining to False 24971 1727096439.44262: Set connection var ansible_module_compression to ZIP_DEFLATED 24971 1727096439.44264: variable 'ansible_shell_executable' from source: unknown 24971 1727096439.44266: variable 'ansible_connection' from source: unknown 24971 1727096439.44272: variable 'ansible_module_compression' from source: unknown 24971 1727096439.44274: variable 'ansible_shell_type' from source: unknown 24971 1727096439.44276: variable 'ansible_shell_executable' from source: unknown 24971 1727096439.44278: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096439.44280: variable 'ansible_pipelining' from source: unknown 24971 1727096439.44282: variable 'ansible_timeout' from source: unknown 24971 1727096439.44284: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096439.44589: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 24971 1727096439.44676: variable 'omit' from source: magic vars 24971 1727096439.44678: starting attempt loop 24971 1727096439.44683: running the handler 24971 1727096439.44685: _low_level_execute_command(): starting 24971 1727096439.44687: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24971 1727096439.46021: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24971 1727096439.46188: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096439.46347: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096439.46432: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096439.46497: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096439.48198: stdout chunk (state=3): >>>/root <<< 24971 1727096439.48348: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096439.48361: stdout chunk (state=3): >>><<< 24971 1727096439.48395: stderr chunk (state=3): >>><<< 24971 1727096439.48778: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096439.48782: _low_level_execute_command(): starting 24971 1727096439.48786: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096439.4850183-26122-237101038482149 `" && echo ansible-tmp-1727096439.4850183-26122-237101038482149="` echo /root/.ansible/tmp/ansible-tmp-1727096439.4850183-26122-237101038482149 `" ) && sleep 0' 24971 1727096439.49793: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24971 1727096439.49979: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 24971 
1727096439.50086: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096439.50154: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096439.52182: stdout chunk (state=3): >>>ansible-tmp-1727096439.4850183-26122-237101038482149=/root/.ansible/tmp/ansible-tmp-1727096439.4850183-26122-237101038482149 <<< 24971 1727096439.52254: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096439.52301: stderr chunk (state=3): >>><<< 24971 1727096439.52311: stdout chunk (state=3): >>><<< 24971 1727096439.52334: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096439.4850183-26122-237101038482149=/root/.ansible/tmp/ansible-tmp-1727096439.4850183-26122-237101038482149 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096439.52574: variable 'ansible_module_compression' from source: unknown 24971 1727096439.52578: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24971ctsu8jan/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 24971 1727096439.52717: variable 'ansible_facts' from source: unknown 24971 1727096439.52972: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096439.4850183-26122-237101038482149/AnsiballZ_network_connections.py 24971 1727096439.53343: Sending initial data 24971 1727096439.53353: Sent initial data (168 bytes) 24971 1727096439.54420: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096439.54424: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 24971 1727096439.54426: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096439.54428: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096439.54436: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 24971 1727096439.54438: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096439.54646: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096439.54688: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096439.56327: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24971 1727096439.56357: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 24971 1727096439.56491: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096439.4850183-26122-237101038482149/AnsiballZ_network_connections.py" <<< 24971 1727096439.56495: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24971ctsu8jan/tmpz4g4aol5 /root/.ansible/tmp/ansible-tmp-1727096439.4850183-26122-237101038482149/AnsiballZ_network_connections.py <<< 24971 1727096439.56517: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24971ctsu8jan/tmpz4g4aol5" to remote "/root/.ansible/tmp/ansible-tmp-1727096439.4850183-26122-237101038482149/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096439.4850183-26122-237101038482149/AnsiballZ_network_connections.py" <<< 24971 1727096439.57815: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096439.57983: stderr chunk (state=3): >>><<< 24971 1727096439.57994: stdout chunk (state=3): >>><<< 24971 1727096439.58021: done transferring module to remote 24971 1727096439.58072: _low_level_execute_command(): starting 24971 1727096439.58076: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096439.4850183-26122-237101038482149/ /root/.ansible/tmp/ansible-tmp-1727096439.4850183-26122-237101038482149/AnsiballZ_network_connections.py && sleep 0' 24971 1727096439.58690: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24971 1727096439.58704: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096439.58785: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096439.58832: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096439.58854: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096439.58872: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096439.58940: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096439.61038: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096439.61041: stdout chunk (state=3): >>><<< 24971 1727096439.61045: stderr chunk (state=3): >>><<< 24971 1727096439.61076: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096439.61080: _low_level_execute_command(): starting 24971 1727096439.61082: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096439.4850183-26122-237101038482149/AnsiballZ_network_connections.py && sleep 0' 24971 1727096439.61977: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096439.61981: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 24971 1727096439.61983: stderr chunk (state=3): >>>debug2: match not found <<< 24971 1727096439.61985: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096439.61987: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096439.61989: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 24971 1727096439.61991: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096439.62025: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096439.62028: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096439.62059: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096439.62112: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096439.94850: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_kc_cvr50/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_kc_cvr50/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on veth0/02b26351-6d86-4c58-9ebc-fea256f8cb97: error=unknown <<< 24971 1727096439.94962: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 24971 1727096439.96886: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096439.96962: stderr chunk (state=3): >>>Shared connection to 10.31.14.152 closed. 
<<< 24971 1727096439.96965: stdout chunk (state=3): >>><<< 24971 1727096439.96969: stderr chunk (state=3): >>><<< 24971 1727096439.96992: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_kc_cvr50/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_kc_cvr50/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on veth0/02b26351-6d86-4c58-9ebc-fea256f8cb97: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
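The module_args recorded in the result above describe a connection-profile cleanup request handled by the fedora.linux_system_roles.network_connections module. As a hedged illustration only, a play that asks the role for this kind of cleanup would look roughly like the sketch below; the play structure and the network_connections variable name are assumptions based on the role named in the task, while the connection fields (name, state, persistent_state) are taken directly from the recorded invocation.

    - hosts: managed_node3
      roles:
        - role: fedora.linux_system_roles.network
          vars:
            network_connections:
              - name: veth0                 # profile name seen in the recorded module_args
                state: down                 # take the connection down...
                persistent_state: absent    # ...and remove its persistent profile
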
24971 1727096439.97102: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'veth0', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096439.4850183-26122-237101038482149/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24971 1727096439.97105: _low_level_execute_command(): starting 24971 1727096439.97107: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096439.4850183-26122-237101038482149/ > /dev/null 2>&1 && sleep 0' 24971 1727096439.97757: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24971 1727096439.97872: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096439.97875: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096439.97914: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096439.97931: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096439.97951: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096439.98026: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096439.99883: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096440.00273: stderr chunk (state=3): >>><<< 24971 1727096440.00277: stdout chunk (state=3): >>><<< 24971 1727096440.00279: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096440.00286: handler run complete 24971 1727096440.00288: attempt loop complete, returning result 24971 1727096440.00290: _execute() done 24971 1727096440.00292: dumping result to json 24971 1727096440.00294: done dumping result, returning 24971 1727096440.00296: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0afff68d-5257-3482-6844-00000000007e] 24971 1727096440.00298: sending task result for task 0afff68d-5257-3482-6844-00000000007e 24971 1727096440.00373: done sending task result for task 0afff68d-5257-3482-6844-00000000007e 24971 1727096440.00376: WORKER PROCESS EXITING changed: [managed_node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "veth0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 24971 1727096440.00582: no more pending results, returning what we have 24971 1727096440.00587: results queue empty 24971 1727096440.00588: checking for any_errors_fatal 24971 1727096440.00595: done checking for any_errors_fatal 24971 1727096440.00596: checking for max_fail_percentage 24971 1727096440.00598: done checking for max_fail_percentage 24971 1727096440.00599: checking to see if all hosts have failed and the running result is not ok 24971 1727096440.00600: done checking to see if all hosts have failed 24971 1727096440.00600: getting the remaining hosts for this loop 24971 1727096440.00602: done getting the remaining hosts for this loop 24971 1727096440.00606: getting the next task for host managed_node3 24971 1727096440.00612: done getting next task for host managed_node3 24971 1727096440.00615: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 24971 1727096440.00619: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 24971 1727096440.00630: getting variables 24971 1727096440.00631: in VariableManager get_vars() 24971 1727096440.01412: Calling all_inventory to load vars for managed_node3 24971 1727096440.01415: Calling groups_inventory to load vars for managed_node3 24971 1727096440.01417: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096440.01429: Calling all_plugins_play to load vars for managed_node3 24971 1727096440.01432: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096440.01435: Calling groups_plugins_play to load vars for managed_node3 24971 1727096440.05174: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096440.09007: done with get_vars() 24971 1727096440.09041: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Monday 23 September 2024 09:00:40 -0400 (0:00:00.948) 0:00:27.569 ****** 24971 1727096440.09132: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state 24971 1727096440.09795: worker is 1 (out of 1 available) 24971 1727096440.09810: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state 24971 1727096440.09825: done queuing things up, now waiting for results queue to drain 24971 1727096440.09826: waiting for pending results... 24971 1727096440.10707: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state 24971 1727096440.10711: in run() - task 0afff68d-5257-3482-6844-00000000007f 24971 1727096440.10727: variable 'ansible_search_path' from source: unknown 24971 1727096440.10783: variable 'ansible_search_path' from source: unknown 24971 1727096440.10807: calling self._execute() 24971 1727096440.11051: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096440.11108: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096440.11112: variable 'omit' from source: magic vars 24971 1727096440.11979: variable 'ansible_distribution_major_version' from source: facts 24971 1727096440.12192: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096440.12322: variable 'network_state' from source: role '' defaults 24971 1727096440.12339: Evaluated conditional (network_state != {}): False 24971 1727096440.12346: when evaluation is False, skipping this task 24971 1727096440.12354: _execute() done 24971 1727096440.12360: dumping result to json 24971 1727096440.12369: done dumping result, returning 24971 1727096440.12381: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [0afff68d-5257-3482-6844-00000000007f] 24971 1727096440.12392: sending task result for task 0afff68d-5257-3482-6844-00000000007f skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 24971 1727096440.12552: no more pending results, returning what we have 24971 1727096440.12557: results queue empty 24971 1727096440.12558: checking for any_errors_fatal 24971 1727096440.12573: done checking for any_errors_fatal 24971 1727096440.12575: checking for max_fail_percentage 24971 1727096440.12577: done checking for max_fail_percentage 24971 1727096440.12579: checking to see if all hosts have failed and the running result is 
not ok 24971 1727096440.12579: done checking to see if all hosts have failed 24971 1727096440.12580: getting the remaining hosts for this loop 24971 1727096440.12582: done getting the remaining hosts for this loop 24971 1727096440.12585: getting the next task for host managed_node3 24971 1727096440.12593: done getting next task for host managed_node3 24971 1727096440.12597: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 24971 1727096440.12601: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 24971 1727096440.12621: getting variables 24971 1727096440.12622: in VariableManager get_vars() 24971 1727096440.12872: Calling all_inventory to load vars for managed_node3 24971 1727096440.12876: Calling groups_inventory to load vars for managed_node3 24971 1727096440.12879: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096440.12892: Calling all_plugins_play to load vars for managed_node3 24971 1727096440.12895: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096440.12898: Calling groups_plugins_play to load vars for managed_node3 24971 1727096440.13419: done sending task result for task 0afff68d-5257-3482-6844-00000000007f 24971 1727096440.13423: WORKER PROCESS EXITING 24971 1727096440.15717: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096440.19049: done with get_vars() 24971 1727096440.19140: done getting variables 24971 1727096440.19208: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Monday 23 September 2024 09:00:40 -0400 (0:00:00.102) 0:00:27.671 ****** 24971 1727096440.19360: entering _queue_task() for managed_node3/debug 24971 1727096440.20184: worker is 1 (out of 1 available) 24971 1727096440.20196: exiting _queue_task() for managed_node3/debug 24971 1727096440.20213: done queuing things up, now waiting for results queue to drain 24971 1727096440.20215: waiting for pending results... 
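The task queued next only echoes the stderr captured from the previous network_connections run; its result (shown further below) prints __network_connections_result.stderr_lines. A minimal sketch of such a task follows; the actual definition lives at roles/network/tasks/main.yml:177 in the collection, so treat this as an illustration rather than the role's verbatim source.

    - name: Show stderr messages for the network_connections
      ansible.builtin.debug:
        var: __network_connections_result.stderr_lines
      when: ansible_distribution_major_version != '6'   # the conditional evaluated as True in this trace
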
24971 1727096440.20986: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 24971 1727096440.21494: in run() - task 0afff68d-5257-3482-6844-000000000080 24971 1727096440.21498: variable 'ansible_search_path' from source: unknown 24971 1727096440.21501: variable 'ansible_search_path' from source: unknown 24971 1727096440.21504: calling self._execute() 24971 1727096440.21693: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096440.21892: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096440.21896: variable 'omit' from source: magic vars 24971 1727096440.22560: variable 'ansible_distribution_major_version' from source: facts 24971 1727096440.22621: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096440.22633: variable 'omit' from source: magic vars 24971 1727096440.22823: variable 'omit' from source: magic vars 24971 1727096440.22849: variable 'omit' from source: magic vars 24971 1727096440.22896: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24971 1727096440.22969: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24971 1727096440.23062: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24971 1727096440.23088: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096440.23162: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096440.23201: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24971 1727096440.23261: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096440.23270: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096440.23356: Set connection var ansible_shell_type to sh 24971 1727096440.23584: Set connection var ansible_shell_executable to /bin/sh 24971 1727096440.23588: Set connection var ansible_timeout to 10 24971 1727096440.23590: Set connection var ansible_connection to ssh 24971 1727096440.23593: Set connection var ansible_pipelining to False 24971 1727096440.23596: Set connection var ansible_module_compression to ZIP_DEFLATED 24971 1727096440.23599: variable 'ansible_shell_executable' from source: unknown 24971 1727096440.23601: variable 'ansible_connection' from source: unknown 24971 1727096440.23604: variable 'ansible_module_compression' from source: unknown 24971 1727096440.23607: variable 'ansible_shell_type' from source: unknown 24971 1727096440.23609: variable 'ansible_shell_executable' from source: unknown 24971 1727096440.23612: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096440.23615: variable 'ansible_pipelining' from source: unknown 24971 1727096440.23618: variable 'ansible_timeout' from source: unknown 24971 1727096440.23620: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096440.23951: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24971 
1727096440.23973: variable 'omit' from source: magic vars 24971 1727096440.23984: starting attempt loop 24971 1727096440.23992: running the handler 24971 1727096440.24269: variable '__network_connections_result' from source: set_fact 24971 1727096440.24402: handler run complete 24971 1727096440.24564: attempt loop complete, returning result 24971 1727096440.24569: _execute() done 24971 1727096440.24572: dumping result to json 24971 1727096440.24574: done dumping result, returning 24971 1727096440.24577: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0afff68d-5257-3482-6844-000000000080] 24971 1727096440.24579: sending task result for task 0afff68d-5257-3482-6844-000000000080 24971 1727096440.24649: done sending task result for task 0afff68d-5257-3482-6844-000000000080 24971 1727096440.24653: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result.stderr_lines": [ "" ] } 24971 1727096440.24725: no more pending results, returning what we have 24971 1727096440.24729: results queue empty 24971 1727096440.24730: checking for any_errors_fatal 24971 1727096440.24736: done checking for any_errors_fatal 24971 1727096440.24737: checking for max_fail_percentage 24971 1727096440.24739: done checking for max_fail_percentage 24971 1727096440.24740: checking to see if all hosts have failed and the running result is not ok 24971 1727096440.24741: done checking to see if all hosts have failed 24971 1727096440.24742: getting the remaining hosts for this loop 24971 1727096440.24743: done getting the remaining hosts for this loop 24971 1727096440.24747: getting the next task for host managed_node3 24971 1727096440.24754: done getting next task for host managed_node3 24971 1727096440.24758: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 24971 1727096440.24761: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 24971 1727096440.24777: getting variables 24971 1727096440.24779: in VariableManager get_vars() 24971 1727096440.24820: Calling all_inventory to load vars for managed_node3 24971 1727096440.24822: Calling groups_inventory to load vars for managed_node3 24971 1727096440.24824: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096440.24834: Calling all_plugins_play to load vars for managed_node3 24971 1727096440.24837: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096440.24839: Calling groups_plugins_play to load vars for managed_node3 24971 1727096440.28273: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096440.30326: done with get_vars() 24971 1727096440.30477: done getting variables 24971 1727096440.30542: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Monday 23 September 2024 09:00:40 -0400 (0:00:00.113) 0:00:27.784 ****** 24971 1727096440.30781: entering _queue_task() for managed_node3/debug 24971 1727096440.31550: worker is 1 (out of 1 available) 24971 1727096440.31562: exiting _queue_task() for managed_node3/debug 24971 1727096440.31574: done queuing things up, now waiting for results queue to drain 24971 1727096440.31575: waiting for pending results... 24971 1727096440.32210: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 24971 1727096440.32503: in run() - task 0afff68d-5257-3482-6844-000000000081 24971 1727096440.32518: variable 'ansible_search_path' from source: unknown 24971 1727096440.32521: variable 'ansible_search_path' from source: unknown 24971 1727096440.32641: calling self._execute() 24971 1727096440.32754: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096440.32758: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096440.32771: variable 'omit' from source: magic vars 24971 1727096440.33636: variable 'ansible_distribution_major_version' from source: facts 24971 1727096440.33639: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096440.33642: variable 'omit' from source: magic vars 24971 1727096440.33644: variable 'omit' from source: magic vars 24971 1727096440.33666: variable 'omit' from source: magic vars 24971 1727096440.33933: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24971 1727096440.33972: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24971 1727096440.33999: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24971 1727096440.34074: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096440.34078: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096440.34081: variable 
'inventory_hostname' from source: host vars for 'managed_node3' 24971 1727096440.34083: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096440.34085: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096440.34295: Set connection var ansible_shell_type to sh 24971 1727096440.34307: Set connection var ansible_shell_executable to /bin/sh 24971 1727096440.34314: Set connection var ansible_timeout to 10 24971 1727096440.34371: Set connection var ansible_connection to ssh 24971 1727096440.34375: Set connection var ansible_pipelining to False 24971 1727096440.34377: Set connection var ansible_module_compression to ZIP_DEFLATED 24971 1727096440.34380: variable 'ansible_shell_executable' from source: unknown 24971 1727096440.34382: variable 'ansible_connection' from source: unknown 24971 1727096440.34384: variable 'ansible_module_compression' from source: unknown 24971 1727096440.34386: variable 'ansible_shell_type' from source: unknown 24971 1727096440.34388: variable 'ansible_shell_executable' from source: unknown 24971 1727096440.34390: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096440.34401: variable 'ansible_pipelining' from source: unknown 24971 1727096440.34403: variable 'ansible_timeout' from source: unknown 24971 1727096440.34541: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096440.34820: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24971 1727096440.35181: variable 'omit' from source: magic vars 24971 1727096440.35184: starting attempt loop 24971 1727096440.35187: running the handler 24971 1727096440.35189: variable '__network_connections_result' from source: set_fact 24971 1727096440.35192: variable '__network_connections_result' from source: set_fact 24971 1727096440.35424: handler run complete 24971 1727096440.35452: attempt loop complete, returning result 24971 1727096440.35456: _execute() done 24971 1727096440.35458: dumping result to json 24971 1727096440.35461: done dumping result, returning 24971 1727096440.35471: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0afff68d-5257-3482-6844-000000000081] 24971 1727096440.35479: sending task result for task 0afff68d-5257-3482-6844-000000000081 24971 1727096440.35571: done sending task result for task 0afff68d-5257-3482-6844-000000000081 24971 1727096440.35575: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "veth0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 24971 1727096440.35696: no more pending results, returning what we have 24971 1727096440.35701: results queue empty 24971 1727096440.35702: checking for any_errors_fatal 24971 1727096440.35710: done checking for any_errors_fatal 24971 1727096440.35711: checking for max_fail_percentage 24971 1727096440.35713: done checking for max_fail_percentage 24971 1727096440.35714: 
checking to see if all hosts have failed and the running result is not ok 24971 1727096440.35715: done checking to see if all hosts have failed 24971 1727096440.35715: getting the remaining hosts for this loop 24971 1727096440.35717: done getting the remaining hosts for this loop 24971 1727096440.35721: getting the next task for host managed_node3 24971 1727096440.35727: done getting next task for host managed_node3 24971 1727096440.35731: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 24971 1727096440.35735: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 24971 1727096440.35746: getting variables 24971 1727096440.35747: in VariableManager get_vars() 24971 1727096440.35792: Calling all_inventory to load vars for managed_node3 24971 1727096440.35794: Calling groups_inventory to load vars for managed_node3 24971 1727096440.35796: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096440.35806: Calling all_plugins_play to load vars for managed_node3 24971 1727096440.35809: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096440.35812: Calling groups_plugins_play to load vars for managed_node3 24971 1727096440.38457: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096440.41935: done with get_vars() 24971 1727096440.42085: done getting variables 24971 1727096440.42143: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Monday 23 September 2024 09:00:40 -0400 (0:00:00.116) 0:00:27.900 ****** 24971 1727096440.42294: entering _queue_task() for managed_node3/debug 24971 1727096440.42893: worker is 1 (out of 1 available) 24971 1727096440.42904: exiting _queue_task() for managed_node3/debug 24971 1727096440.42915: done queuing things up, now waiting for results queue to drain 24971 1727096440.42916: waiting for pending results... 
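The task queued next is one of several gated on the network_state role variable. Because network_state keeps its empty role default in this run, the guard evaluates to False and the task is skipped, exactly as the result below reports. A hypothetical sketch of such a guarded task is shown here; the variable passed to debug is an assumption for illustration, only the guard expression is taken from the trace.

    - name: Show debug messages for the network_state
      ansible.builtin.debug:
        var: network_state
      when: network_state != {}   # False while network_state is left at its empty default, so the task is skipped
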
24971 1727096440.43432: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 24971 1727096440.43555: in run() - task 0afff68d-5257-3482-6844-000000000082 24971 1727096440.43573: variable 'ansible_search_path' from source: unknown 24971 1727096440.43880: variable 'ansible_search_path' from source: unknown 24971 1727096440.43920: calling self._execute() 24971 1727096440.44134: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096440.44138: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096440.44141: variable 'omit' from source: magic vars 24971 1727096440.44813: variable 'ansible_distribution_major_version' from source: facts 24971 1727096440.44824: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096440.45106: variable 'network_state' from source: role '' defaults 24971 1727096440.45117: Evaluated conditional (network_state != {}): False 24971 1727096440.45123: when evaluation is False, skipping this task 24971 1727096440.45126: _execute() done 24971 1727096440.45128: dumping result to json 24971 1727096440.45131: done dumping result, returning 24971 1727096440.45133: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0afff68d-5257-3482-6844-000000000082] 24971 1727096440.45139: sending task result for task 0afff68d-5257-3482-6844-000000000082 24971 1727096440.45606: done sending task result for task 0afff68d-5257-3482-6844-000000000082 24971 1727096440.45609: WORKER PROCESS EXITING skipping: [managed_node3] => { "false_condition": "network_state != {}" } 24971 1727096440.45664: no more pending results, returning what we have 24971 1727096440.45671: results queue empty 24971 1727096440.45672: checking for any_errors_fatal 24971 1727096440.45681: done checking for any_errors_fatal 24971 1727096440.45682: checking for max_fail_percentage 24971 1727096440.45684: done checking for max_fail_percentage 24971 1727096440.45685: checking to see if all hosts have failed and the running result is not ok 24971 1727096440.45686: done checking to see if all hosts have failed 24971 1727096440.45687: getting the remaining hosts for this loop 24971 1727096440.45688: done getting the remaining hosts for this loop 24971 1727096440.45692: getting the next task for host managed_node3 24971 1727096440.45697: done getting next task for host managed_node3 24971 1727096440.45701: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 24971 1727096440.45704: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 24971 1727096440.45727: getting variables 24971 1727096440.45729: in VariableManager get_vars() 24971 1727096440.45764: Calling all_inventory to load vars for managed_node3 24971 1727096440.45769: Calling groups_inventory to load vars for managed_node3 24971 1727096440.45771: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096440.45781: Calling all_plugins_play to load vars for managed_node3 24971 1727096440.45784: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096440.45786: Calling groups_plugins_play to load vars for managed_node3 24971 1727096440.56061: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096440.57762: done with get_vars() 24971 1727096440.57791: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Monday 23 September 2024 09:00:40 -0400 (0:00:00.155) 0:00:28.056 ****** 24971 1727096440.57886: entering _queue_task() for managed_node3/ping 24971 1727096440.58489: worker is 1 (out of 1 available) 24971 1727096440.58499: exiting _queue_task() for managed_node3/ping 24971 1727096440.58509: done queuing things up, now waiting for results queue to drain 24971 1727096440.58510: waiting for pending results... 24971 1727096440.58586: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 24971 1727096440.58850: in run() - task 0afff68d-5257-3482-6844-000000000083 24971 1727096440.58854: variable 'ansible_search_path' from source: unknown 24971 1727096440.58857: variable 'ansible_search_path' from source: unknown 24971 1727096440.58860: calling self._execute() 24971 1727096440.58922: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096440.58937: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096440.58964: variable 'omit' from source: magic vars 24971 1727096440.59359: variable 'ansible_distribution_major_version' from source: facts 24971 1727096440.59379: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096440.59399: variable 'omit' from source: magic vars 24971 1727096440.59460: variable 'omit' from source: magic vars 24971 1727096440.59511: variable 'omit' from source: magic vars 24971 1727096440.59554: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24971 1727096440.59595: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24971 1727096440.59629: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24971 1727096440.59720: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096440.59723: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096440.59726: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24971 1727096440.59729: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096440.59731: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096440.59812: Set connection var ansible_shell_type to sh 24971 1727096440.59836: Set connection var 
ansible_shell_executable to /bin/sh 24971 1727096440.59853: Set connection var ansible_timeout to 10 24971 1727096440.59863: Set connection var ansible_connection to ssh 24971 1727096440.59876: Set connection var ansible_pipelining to False 24971 1727096440.59887: Set connection var ansible_module_compression to ZIP_DEFLATED 24971 1727096440.59911: variable 'ansible_shell_executable' from source: unknown 24971 1727096440.59923: variable 'ansible_connection' from source: unknown 24971 1727096440.59956: variable 'ansible_module_compression' from source: unknown 24971 1727096440.59964: variable 'ansible_shell_type' from source: unknown 24971 1727096440.60051: variable 'ansible_shell_executable' from source: unknown 24971 1727096440.60055: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096440.60058: variable 'ansible_pipelining' from source: unknown 24971 1727096440.60061: variable 'ansible_timeout' from source: unknown 24971 1727096440.60063: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096440.60308: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 24971 1727096440.60327: variable 'omit' from source: magic vars 24971 1727096440.60336: starting attempt loop 24971 1727096440.60344: running the handler 24971 1727096440.60361: _low_level_execute_command(): starting 24971 1727096440.60387: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24971 1727096440.61093: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24971 1727096440.61110: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096440.61151: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096440.61253: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096440.61275: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096440.61317: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096440.61438: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096440.63053: stdout chunk (state=3): >>>/root <<< 24971 1727096440.63144: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096440.63176: stderr chunk (state=3): >>><<< 24971 1727096440.63180: stdout chunk (state=3): >>><<< 24971 1727096440.63220: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 
3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096440.63227: _low_level_execute_command(): starting 24971 1727096440.63233: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096440.63214-26178-198666239408823 `" && echo ansible-tmp-1727096440.63214-26178-198666239408823="` echo /root/.ansible/tmp/ansible-tmp-1727096440.63214-26178-198666239408823 `" ) && sleep 0' 24971 1727096440.63876: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24971 1727096440.63880: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096440.63883: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096440.63895: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24971 1727096440.63899: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 24971 1727096440.63910: stderr chunk (state=3): >>>debug2: match not found <<< 24971 1727096440.63912: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096440.63976: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24971 1727096440.63984: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096440.64017: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096440.64028: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096440.64041: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096440.64102: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096440.66044: stdout chunk (state=3): 
>>>ansible-tmp-1727096440.63214-26178-198666239408823=/root/.ansible/tmp/ansible-tmp-1727096440.63214-26178-198666239408823 <<< 24971 1727096440.66277: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096440.66281: stdout chunk (state=3): >>><<< 24971 1727096440.66284: stderr chunk (state=3): >>><<< 24971 1727096440.66287: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096440.63214-26178-198666239408823=/root/.ansible/tmp/ansible-tmp-1727096440.63214-26178-198666239408823 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096440.66308: variable 'ansible_module_compression' from source: unknown 24971 1727096440.66473: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24971ctsu8jan/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 24971 1727096440.66480: variable 'ansible_facts' from source: unknown 24971 1727096440.66576: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096440.63214-26178-198666239408823/AnsiballZ_ping.py 24971 1727096440.66784: Sending initial data 24971 1727096440.66788: Sent initial data (151 bytes) 24971 1727096440.67454: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24971 1727096440.67605: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24971 1727096440.67611: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096440.67639: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 
1727096440.67703: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096440.69334: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 24971 1727096440.69366: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24971 1727096440.69390: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 24971 1727096440.69434: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24971ctsu8jan/tmpaszhxw0z /root/.ansible/tmp/ansible-tmp-1727096440.63214-26178-198666239408823/AnsiballZ_ping.py <<< 24971 1727096440.69440: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096440.63214-26178-198666239408823/AnsiballZ_ping.py" <<< 24971 1727096440.69466: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24971ctsu8jan/tmpaszhxw0z" to remote "/root/.ansible/tmp/ansible-tmp-1727096440.63214-26178-198666239408823/AnsiballZ_ping.py" <<< 24971 1727096440.69474: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096440.63214-26178-198666239408823/AnsiballZ_ping.py" <<< 24971 1727096440.70577: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096440.70581: stdout chunk (state=3): >>><<< 24971 1727096440.70584: stderr chunk (state=3): >>><<< 24971 1727096440.70586: done transferring module to remote 24971 1727096440.70588: _low_level_execute_command(): starting 24971 1727096440.70591: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096440.63214-26178-198666239408823/ /root/.ansible/tmp/ansible-tmp-1727096440.63214-26178-198666239408823/AnsiballZ_ping.py && sleep 0' 24971 1727096440.71441: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096440.71447: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096440.71468: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration <<< 24971 1727096440.71510: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: 
match found <<< 24971 1727096440.71520: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096440.71557: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096440.71596: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096440.71639: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096440.73488: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096440.73520: stderr chunk (state=3): >>><<< 24971 1727096440.73522: stdout chunk (state=3): >>><<< 24971 1727096440.73533: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096440.73574: _low_level_execute_command(): starting 24971 1727096440.73578: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096440.63214-26178-198666239408823/AnsiballZ_ping.py && sleep 0' 24971 1727096440.73998: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096440.74002: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096440.74005: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096440.74007: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096440.74061: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 24971 
1727096440.74064: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096440.74109: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096440.89266: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 24971 1727096440.90595: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. <<< 24971 1727096440.90620: stderr chunk (state=3): >>><<< 24971 1727096440.90623: stdout chunk (state=3): >>><<< 24971 1727096440.90639: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
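The exchange above is the "Re-test connectivity" step of fedora.linux_system_roles.network round-tripping the ansible.modules.ping module: a remote temp directory is created, AnsiballZ_ping.py is copied over SFTP, made executable, run with /usr/bin/python3.12, and the module prints the JSON result captured in stdout. A minimal sketch of a task that produces this kind of trace, assuming the usual test layout rather than quoting the role's actual source:

    - name: Re-test connectivity
      ping:

The ping module accepts a single optional data parameter (default "pong") and echoes it back with changed: false, which is why the result reported below is {"ping": "pong"}.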
24971 1727096440.90664: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096440.63214-26178-198666239408823/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24971 1727096440.90676: _low_level_execute_command(): starting 24971 1727096440.90681: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096440.63214-26178-198666239408823/ > /dev/null 2>&1 && sleep 0' 24971 1727096440.91138: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096440.91142: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 24971 1727096440.91144: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096440.91147: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096440.91149: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096440.91207: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096440.91223: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096440.91229: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096440.91249: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096440.93064: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096440.93086: stderr chunk (state=3): >>><<< 24971 1727096440.93090: stdout chunk (state=3): >>><<< 24971 1727096440.93104: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096440.93111: handler run complete 24971 1727096440.93123: attempt loop complete, returning result 24971 1727096440.93126: _execute() done 24971 1727096440.93128: dumping result to json 24971 1727096440.93130: done dumping result, returning 24971 1727096440.93139: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [0afff68d-5257-3482-6844-000000000083] 24971 1727096440.93142: sending task result for task 0afff68d-5257-3482-6844-000000000083 24971 1727096440.93239: done sending task result for task 0afff68d-5257-3482-6844-000000000083 24971 1727096440.93242: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "ping": "pong" } 24971 1727096440.93337: no more pending results, returning what we have 24971 1727096440.93341: results queue empty 24971 1727096440.93342: checking for any_errors_fatal 24971 1727096440.93350: done checking for any_errors_fatal 24971 1727096440.93351: checking for max_fail_percentage 24971 1727096440.93352: done checking for max_fail_percentage 24971 1727096440.93353: checking to see if all hosts have failed and the running result is not ok 24971 1727096440.93354: done checking to see if all hosts have failed 24971 1727096440.93355: getting the remaining hosts for this loop 24971 1727096440.93357: done getting the remaining hosts for this loop 24971 1727096440.93359: getting the next task for host managed_node3 24971 1727096440.93372: done getting next task for host managed_node3 24971 1727096440.93374: ^ task is: TASK: meta (role_complete) 24971 1727096440.93378: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 24971 1727096440.93389: getting variables 24971 1727096440.93390: in VariableManager get_vars() 24971 1727096440.93428: Calling all_inventory to load vars for managed_node3 24971 1727096440.93430: Calling groups_inventory to load vars for managed_node3 24971 1727096440.93432: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096440.93441: Calling all_plugins_play to load vars for managed_node3 24971 1727096440.93444: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096440.93447: Calling groups_plugins_play to load vars for managed_node3 24971 1727096440.94248: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096440.95224: done with get_vars() 24971 1727096440.95240: done getting variables 24971 1727096440.95303: done queuing things up, now waiting for results queue to drain 24971 1727096440.95305: results queue empty 24971 1727096440.95305: checking for any_errors_fatal 24971 1727096440.95307: done checking for any_errors_fatal 24971 1727096440.95308: checking for max_fail_percentage 24971 1727096440.95308: done checking for max_fail_percentage 24971 1727096440.95309: checking to see if all hosts have failed and the running result is not ok 24971 1727096440.95309: done checking to see if all hosts have failed 24971 1727096440.95310: getting the remaining hosts for this loop 24971 1727096440.95311: done getting the remaining hosts for this loop 24971 1727096440.95313: getting the next task for host managed_node3 24971 1727096440.95318: done getting next task for host managed_node3 24971 1727096440.95320: ^ task is: TASK: Include the task 'manage_test_interface.yml' 24971 1727096440.95321: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=4, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24971 1727096440.95323: getting variables 24971 1727096440.95324: in VariableManager get_vars() 24971 1727096440.95333: Calling all_inventory to load vars for managed_node3 24971 1727096440.95335: Calling groups_inventory to load vars for managed_node3 24971 1727096440.95336: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096440.95339: Calling all_plugins_play to load vars for managed_node3 24971 1727096440.95341: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096440.95342: Calling groups_plugins_play to load vars for managed_node3 24971 1727096440.95980: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096440.96851: done with get_vars() 24971 1727096440.96865: done getting variables TASK [Include the task 'manage_test_interface.yml'] **************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:104 Monday 23 September 2024 09:00:40 -0400 (0:00:00.390) 0:00:28.446 ****** 24971 1727096440.96919: entering _queue_task() for managed_node3/include_tasks 24971 1727096440.97199: worker is 1 (out of 1 available) 24971 1727096440.97213: exiting _queue_task() for managed_node3/include_tasks 24971 1727096440.97226: done queuing things up, now waiting for results queue to drain 24971 1727096440.97227: waiting for pending results... 
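The task queued here, "Include the task 'manage_test_interface.yml'" at tests_ipv6.yml:104, is a plain include_tasks call. A minimal sketch of its likely shape; the relative path follows from the file locations shown in the trace, while the vars block is an assumption (the trace only shows that a 'state' variable later arrives as an include parameter):

    - name: Include the task 'manage_test_interface.yml'
      include_tasks: tasks/manage_test_interface.yml
      vars:
        state: present   # hypothetical value; only the variable's existence is visible in the trace

The conditional evaluated for it just below, ansible_distribution_major_version != '6', gates whether the file is loaded at all.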
24971 1727096440.97419: running TaskExecutor() for managed_node3/TASK: Include the task 'manage_test_interface.yml' 24971 1727096440.97502: in run() - task 0afff68d-5257-3482-6844-0000000000b3 24971 1727096440.97515: variable 'ansible_search_path' from source: unknown 24971 1727096440.97543: calling self._execute() 24971 1727096440.97628: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096440.97632: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096440.97641: variable 'omit' from source: magic vars 24971 1727096440.97927: variable 'ansible_distribution_major_version' from source: facts 24971 1727096440.97937: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096440.97949: _execute() done 24971 1727096440.97952: dumping result to json 24971 1727096440.97955: done dumping result, returning 24971 1727096440.97957: done running TaskExecutor() for managed_node3/TASK: Include the task 'manage_test_interface.yml' [0afff68d-5257-3482-6844-0000000000b3] 24971 1727096440.97962: sending task result for task 0afff68d-5257-3482-6844-0000000000b3 24971 1727096440.98057: done sending task result for task 0afff68d-5257-3482-6844-0000000000b3 24971 1727096440.98060: WORKER PROCESS EXITING 24971 1727096440.98088: no more pending results, returning what we have 24971 1727096440.98093: in VariableManager get_vars() 24971 1727096440.98137: Calling all_inventory to load vars for managed_node3 24971 1727096440.98140: Calling groups_inventory to load vars for managed_node3 24971 1727096440.98142: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096440.98154: Calling all_plugins_play to load vars for managed_node3 24971 1727096440.98157: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096440.98160: Calling groups_plugins_play to load vars for managed_node3 24971 1727096440.99104: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096440.99952: done with get_vars() 24971 1727096440.99966: variable 'ansible_search_path' from source: unknown 24971 1727096440.99981: we have included files to process 24971 1727096440.99982: generating all_blocks data 24971 1727096440.99983: done generating all_blocks data 24971 1727096440.99987: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 24971 1727096440.99987: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 24971 1727096440.99989: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 24971 1727096441.00243: in VariableManager get_vars() 24971 1727096441.00260: done with get_vars() 24971 1727096441.00691: done processing included file 24971 1727096441.00693: iterating over new_blocks loaded from include file 24971 1727096441.00694: in VariableManager get_vars() 24971 1727096441.00706: done with get_vars() 24971 1727096441.00707: filtering new block on tags 24971 1727096441.00727: done filtering new block on tags 24971 1727096441.00728: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml for managed_node3 24971 1727096441.00732: extending task lists for 
all hosts with included blocks 24971 1727096441.02193: done extending task lists 24971 1727096441.02196: done processing included files 24971 1727096441.02196: results queue empty 24971 1727096441.02197: checking for any_errors_fatal 24971 1727096441.02199: done checking for any_errors_fatal 24971 1727096441.02199: checking for max_fail_percentage 24971 1727096441.02201: done checking for max_fail_percentage 24971 1727096441.02201: checking to see if all hosts have failed and the running result is not ok 24971 1727096441.02202: done checking to see if all hosts have failed 24971 1727096441.02203: getting the remaining hosts for this loop 24971 1727096441.02204: done getting the remaining hosts for this loop 24971 1727096441.02206: getting the next task for host managed_node3 24971 1727096441.02210: done getting next task for host managed_node3 24971 1727096441.02212: ^ task is: TASK: Ensure state in ["present", "absent"] 24971 1727096441.02214: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 24971 1727096441.02216: getting variables 24971 1727096441.02217: in VariableManager get_vars() 24971 1727096441.02233: Calling all_inventory to load vars for managed_node3 24971 1727096441.02235: Calling groups_inventory to load vars for managed_node3 24971 1727096441.02237: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096441.02244: Calling all_plugins_play to load vars for managed_node3 24971 1727096441.02247: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096441.02250: Calling groups_plugins_play to load vars for managed_node3 24971 1727096441.03411: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096441.05029: done with get_vars() 24971 1727096441.05061: done getting variables 24971 1727096441.05113: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Ensure state in ["present", "absent"]] *********************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:3 Monday 23 September 2024 09:00:41 -0400 (0:00:00.082) 0:00:28.529 ****** 24971 1727096441.05145: entering _queue_task() for managed_node3/fail 24971 1727096441.05521: worker is 1 (out of 1 available) 24971 1727096441.05535: exiting _queue_task() for managed_node3/fail 24971 1727096441.05547: done queuing things up, now waiting for results queue to drain 24971 1727096441.05548: waiting for pending results... 
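The next two tasks, "Ensure state in [\"present\", \"absent\"]" and "Ensure type in [\"dummy\", \"tap\", \"veth\"]", are input-validation guards built on the fail action plugin that the trace loads for each of them. A minimal sketch of the pattern, reconstructed from the false_condition strings reported below rather than copied from the role source (the msg texts are assumptions):

    - name: Ensure state in ["present", "absent"]
      fail:
        msg: unsupported state            # hypothetical message
      when: state not in ["present", "absent"]

    - name: Ensure type in ["dummy", "tap", "veth"]
      fail:
        msg: unsupported interface type   # hypothetical message
      when: type not in ["dummy", "tap", "veth"]

Both conditions evaluate to False here, so each task is skipped with skip_reason "Conditional result was False".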
24971 1727096441.05945: running TaskExecutor() for managed_node3/TASK: Ensure state in ["present", "absent"] 24971 1727096441.05951: in run() - task 0afff68d-5257-3482-6844-0000000005cc 24971 1727096441.05971: variable 'ansible_search_path' from source: unknown 24971 1727096441.05981: variable 'ansible_search_path' from source: unknown 24971 1727096441.06041: calling self._execute() 24971 1727096441.06154: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096441.06260: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096441.06264: variable 'omit' from source: magic vars 24971 1727096441.06664: variable 'ansible_distribution_major_version' from source: facts 24971 1727096441.06694: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096441.06864: variable 'state' from source: include params 24971 1727096441.06877: Evaluated conditional (state not in ["present", "absent"]): False 24971 1727096441.06885: when evaluation is False, skipping this task 24971 1727096441.06913: _execute() done 24971 1727096441.06923: dumping result to json 24971 1727096441.06935: done dumping result, returning 24971 1727096441.06947: done running TaskExecutor() for managed_node3/TASK: Ensure state in ["present", "absent"] [0afff68d-5257-3482-6844-0000000005cc] 24971 1727096441.06958: sending task result for task 0afff68d-5257-3482-6844-0000000005cc 24971 1727096441.07224: done sending task result for task 0afff68d-5257-3482-6844-0000000005cc 24971 1727096441.07478: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "state not in [\"present\", \"absent\"]", "skip_reason": "Conditional result was False" } 24971 1727096441.07532: no more pending results, returning what we have 24971 1727096441.07536: results queue empty 24971 1727096441.07537: checking for any_errors_fatal 24971 1727096441.07539: done checking for any_errors_fatal 24971 1727096441.07539: checking for max_fail_percentage 24971 1727096441.07541: done checking for max_fail_percentage 24971 1727096441.07542: checking to see if all hosts have failed and the running result is not ok 24971 1727096441.07543: done checking to see if all hosts have failed 24971 1727096441.07543: getting the remaining hosts for this loop 24971 1727096441.07545: done getting the remaining hosts for this loop 24971 1727096441.07548: getting the next task for host managed_node3 24971 1727096441.07553: done getting next task for host managed_node3 24971 1727096441.07556: ^ task is: TASK: Ensure type in ["dummy", "tap", "veth"] 24971 1727096441.07559: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 24971 1727096441.07564: getting variables 24971 1727096441.07565: in VariableManager get_vars() 24971 1727096441.07611: Calling all_inventory to load vars for managed_node3 24971 1727096441.07614: Calling groups_inventory to load vars for managed_node3 24971 1727096441.07616: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096441.07626: Calling all_plugins_play to load vars for managed_node3 24971 1727096441.07628: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096441.07631: Calling groups_plugins_play to load vars for managed_node3 24971 1727096441.08638: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096441.10111: done with get_vars() 24971 1727096441.10147: done getting variables 24971 1727096441.10216: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Ensure type in ["dummy", "tap", "veth"]] ********************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:8 Monday 23 September 2024 09:00:41 -0400 (0:00:00.051) 0:00:28.580 ****** 24971 1727096441.10249: entering _queue_task() for managed_node3/fail 24971 1727096441.10618: worker is 1 (out of 1 available) 24971 1727096441.10630: exiting _queue_task() for managed_node3/fail 24971 1727096441.10642: done queuing things up, now waiting for results queue to drain 24971 1727096441.10642: waiting for pending results... 24971 1727096441.10933: running TaskExecutor() for managed_node3/TASK: Ensure type in ["dummy", "tap", "veth"] 24971 1727096441.11095: in run() - task 0afff68d-5257-3482-6844-0000000005cd 24971 1727096441.11099: variable 'ansible_search_path' from source: unknown 24971 1727096441.11101: variable 'ansible_search_path' from source: unknown 24971 1727096441.11128: calling self._execute() 24971 1727096441.11230: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096441.11473: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096441.11477: variable 'omit' from source: magic vars 24971 1727096441.11618: variable 'ansible_distribution_major_version' from source: facts 24971 1727096441.11635: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096441.11784: variable 'type' from source: play vars 24971 1727096441.11796: Evaluated conditional (type not in ["dummy", "tap", "veth"]): False 24971 1727096441.11809: when evaluation is False, skipping this task 24971 1727096441.11818: _execute() done 24971 1727096441.11826: dumping result to json 24971 1727096441.11833: done dumping result, returning 24971 1727096441.11843: done running TaskExecutor() for managed_node3/TASK: Ensure type in ["dummy", "tap", "veth"] [0afff68d-5257-3482-6844-0000000005cd] 24971 1727096441.11856: sending task result for task 0afff68d-5257-3482-6844-0000000005cd skipping: [managed_node3] => { "changed": false, "false_condition": "type not in [\"dummy\", \"tap\", \"veth\"]", "skip_reason": "Conditional result was False" } 24971 1727096441.12010: no more pending results, returning what we have 24971 1727096441.12013: results queue empty 24971 1727096441.12014: checking for any_errors_fatal 24971 
1727096441.12023: done checking for any_errors_fatal 24971 1727096441.12024: checking for max_fail_percentage 24971 1727096441.12026: done checking for max_fail_percentage 24971 1727096441.12027: checking to see if all hosts have failed and the running result is not ok 24971 1727096441.12028: done checking to see if all hosts have failed 24971 1727096441.12029: getting the remaining hosts for this loop 24971 1727096441.12030: done getting the remaining hosts for this loop 24971 1727096441.12034: getting the next task for host managed_node3 24971 1727096441.12041: done getting next task for host managed_node3 24971 1727096441.12044: ^ task is: TASK: Include the task 'show_interfaces.yml' 24971 1727096441.12048: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 24971 1727096441.12052: getting variables 24971 1727096441.12054: in VariableManager get_vars() 24971 1727096441.12100: Calling all_inventory to load vars for managed_node3 24971 1727096441.12103: Calling groups_inventory to load vars for managed_node3 24971 1727096441.12106: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096441.12119: Calling all_plugins_play to load vars for managed_node3 24971 1727096441.12123: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096441.12126: Calling groups_plugins_play to load vars for managed_node3 24971 1727096441.12687: done sending task result for task 0afff68d-5257-3482-6844-0000000005cd 24971 1727096441.12691: WORKER PROCESS EXITING 24971 1727096441.13701: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096441.15701: done with get_vars() 24971 1727096441.15725: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:13 Monday 23 September 2024 09:00:41 -0400 (0:00:00.055) 0:00:28.635 ****** 24971 1727096441.15821: entering _queue_task() for managed_node3/include_tasks 24971 1727096441.16562: worker is 1 (out of 1 available) 24971 1727096441.16576: exiting _queue_task() for managed_node3/include_tasks 24971 1727096441.16591: done queuing things up, now waiting for results queue to drain 24971 1727096441.16592: waiting for pending results... 
24971 1727096441.17238: running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' 24971 1727096441.17442: in run() - task 0afff68d-5257-3482-6844-0000000005ce 24971 1727096441.17528: variable 'ansible_search_path' from source: unknown 24971 1727096441.17533: variable 'ansible_search_path' from source: unknown 24971 1727096441.17639: calling self._execute() 24971 1727096441.17820: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096441.17824: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096441.17835: variable 'omit' from source: magic vars 24971 1727096441.18416: variable 'ansible_distribution_major_version' from source: facts 24971 1727096441.18419: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096441.18421: _execute() done 24971 1727096441.18424: dumping result to json 24971 1727096441.18427: done dumping result, returning 24971 1727096441.18430: done running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' [0afff68d-5257-3482-6844-0000000005ce] 24971 1727096441.18433: sending task result for task 0afff68d-5257-3482-6844-0000000005ce 24971 1727096441.18508: done sending task result for task 0afff68d-5257-3482-6844-0000000005ce 24971 1727096441.18510: WORKER PROCESS EXITING 24971 1727096441.18546: no more pending results, returning what we have 24971 1727096441.18551: in VariableManager get_vars() 24971 1727096441.18600: Calling all_inventory to load vars for managed_node3 24971 1727096441.18603: Calling groups_inventory to load vars for managed_node3 24971 1727096441.18605: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096441.18619: Calling all_plugins_play to load vars for managed_node3 24971 1727096441.18622: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096441.18624: Calling groups_plugins_play to load vars for managed_node3 24971 1727096441.20122: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096441.21911: done with get_vars() 24971 1727096441.21942: variable 'ansible_search_path' from source: unknown 24971 1727096441.21944: variable 'ansible_search_path' from source: unknown 24971 1727096441.22013: we have included files to process 24971 1727096441.22014: generating all_blocks data 24971 1727096441.22016: done generating all_blocks data 24971 1727096441.22037: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 24971 1727096441.22038: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 24971 1727096441.22041: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 24971 1727096441.22182: in VariableManager get_vars() 24971 1727096441.22232: done with get_vars() 24971 1727096441.22366: done processing included file 24971 1727096441.22372: iterating over new_blocks loaded from include file 24971 1727096441.22374: in VariableManager get_vars() 24971 1727096441.22393: done with get_vars() 24971 1727096441.22395: filtering new block on tags 24971 1727096441.22433: done filtering new block on tags 24971 1727096441.22436: done iterating over new_blocks loaded from include file included: 
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node3 24971 1727096441.22441: extending task lists for all hosts with included blocks 24971 1727096441.22932: done extending task lists 24971 1727096441.22934: done processing included files 24971 1727096441.22935: results queue empty 24971 1727096441.22935: checking for any_errors_fatal 24971 1727096441.22939: done checking for any_errors_fatal 24971 1727096441.22939: checking for max_fail_percentage 24971 1727096441.22940: done checking for max_fail_percentage 24971 1727096441.22941: checking to see if all hosts have failed and the running result is not ok 24971 1727096441.22942: done checking to see if all hosts have failed 24971 1727096441.22943: getting the remaining hosts for this loop 24971 1727096441.22944: done getting the remaining hosts for this loop 24971 1727096441.22946: getting the next task for host managed_node3 24971 1727096441.22951: done getting next task for host managed_node3 24971 1727096441.22958: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 24971 1727096441.22965: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 24971 1727096441.22975: getting variables 24971 1727096441.22977: in VariableManager get_vars() 24971 1727096441.22991: Calling all_inventory to load vars for managed_node3 24971 1727096441.22994: Calling groups_inventory to load vars for managed_node3 24971 1727096441.22996: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096441.23002: Calling all_plugins_play to load vars for managed_node3 24971 1727096441.23004: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096441.23007: Calling groups_plugins_play to load vars for managed_node3 24971 1727096441.24459: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096441.27021: done with get_vars() 24971 1727096441.27049: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Monday 23 September 2024 09:00:41 -0400 (0:00:00.113) 0:00:28.749 ****** 24971 1727096441.27155: entering _queue_task() for managed_node3/include_tasks 24971 1727096441.27529: worker is 1 (out of 1 available) 24971 1727096441.27542: exiting _queue_task() for managed_node3/include_tasks 24971 1727096441.27554: done queuing things up, now waiting for results queue to drain 24971 1727096441.27555: waiting for pending results... 
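Throughout this trace, ansible_host and ansible_ssh_extra_args are resolved from host vars for managed_node3, and every SSH connection targets 10.31.14.152. The inventory file itself is not part of this excerpt, so the snippet below is only a minimal reconstruction consistent with those observations; the group layout and the extra-args value are assumptions:

    all:
      hosts:
        managed_node3:
          ansible_host: 10.31.14.152
          ansible_ssh_extra_args: -o StrictHostKeyChecking=no   # hypothetical value; only the variable's existence is visible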
24971 1727096441.27896: running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' 24971 1727096441.27944: in run() - task 0afff68d-5257-3482-6844-0000000006e4 24971 1727096441.27972: variable 'ansible_search_path' from source: unknown 24971 1727096441.27977: variable 'ansible_search_path' from source: unknown 24971 1727096441.28007: calling self._execute() 24971 1727096441.28174: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096441.28178: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096441.28181: variable 'omit' from source: magic vars 24971 1727096441.28495: variable 'ansible_distribution_major_version' from source: facts 24971 1727096441.28508: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096441.28516: _execute() done 24971 1727096441.28519: dumping result to json 24971 1727096441.28522: done dumping result, returning 24971 1727096441.28534: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' [0afff68d-5257-3482-6844-0000000006e4] 24971 1727096441.28536: sending task result for task 0afff68d-5257-3482-6844-0000000006e4 24971 1727096441.28628: done sending task result for task 0afff68d-5257-3482-6844-0000000006e4 24971 1727096441.28631: WORKER PROCESS EXITING 24971 1727096441.28671: no more pending results, returning what we have 24971 1727096441.28676: in VariableManager get_vars() 24971 1727096441.28721: Calling all_inventory to load vars for managed_node3 24971 1727096441.28724: Calling groups_inventory to load vars for managed_node3 24971 1727096441.28727: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096441.28739: Calling all_plugins_play to load vars for managed_node3 24971 1727096441.28743: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096441.28745: Calling groups_plugins_play to load vars for managed_node3 24971 1727096441.31061: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096441.32790: done with get_vars() 24971 1727096441.32811: variable 'ansible_search_path' from source: unknown 24971 1727096441.32812: variable 'ansible_search_path' from source: unknown 24971 1727096441.32882: we have included files to process 24971 1727096441.32884: generating all_blocks data 24971 1727096441.32885: done generating all_blocks data 24971 1727096441.32886: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 24971 1727096441.32887: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 24971 1727096441.32890: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 24971 1727096441.33163: done processing included file 24971 1727096441.33164: iterating over new_blocks loaded from include file 24971 1727096441.33176: in VariableManager get_vars() 24971 1727096441.33198: done with get_vars() 24971 1727096441.33200: filtering new block on tags 24971 1727096441.33218: done filtering new block on tags 24971 1727096441.33220: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for 
managed_node3 24971 1727096441.33225: extending task lists for all hosts with included blocks 24971 1727096441.33385: done extending task lists 24971 1727096441.33392: done processing included files 24971 1727096441.33393: results queue empty 24971 1727096441.33394: checking for any_errors_fatal 24971 1727096441.33397: done checking for any_errors_fatal 24971 1727096441.33398: checking for max_fail_percentage 24971 1727096441.33399: done checking for max_fail_percentage 24971 1727096441.33400: checking to see if all hosts have failed and the running result is not ok 24971 1727096441.33401: done checking to see if all hosts have failed 24971 1727096441.33401: getting the remaining hosts for this loop 24971 1727096441.33402: done getting the remaining hosts for this loop 24971 1727096441.33405: getting the next task for host managed_node3 24971 1727096441.33409: done getting next task for host managed_node3 24971 1727096441.33412: ^ task is: TASK: Gather current interface info 24971 1727096441.33415: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 24971 1727096441.33417: getting variables 24971 1727096441.33418: in VariableManager get_vars() 24971 1727096441.33432: Calling all_inventory to load vars for managed_node3 24971 1727096441.33434: Calling groups_inventory to load vars for managed_node3 24971 1727096441.33436: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096441.33441: Calling all_plugins_play to load vars for managed_node3 24971 1727096441.33443: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096441.33447: Calling groups_plugins_play to load vars for managed_node3 24971 1727096441.34719: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096441.36844: done with get_vars() 24971 1727096441.36877: done getting variables 24971 1727096441.36922: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Monday 23 September 2024 09:00:41 -0400 (0:00:00.097) 0:00:28.847 ****** 24971 1727096441.36954: entering _queue_task() for managed_node3/command 24971 1727096441.37322: worker is 1 (out of 1 available) 24971 1727096441.37335: exiting _queue_task() for managed_node3/command 24971 1727096441.37347: done queuing things up, now waiting for results queue to drain 24971 1727096441.37348: waiting for pending results... 
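The task that runs next, "Gather current interface info" (get_current_interfaces.yml:3), is an ordinary command task: the trace loads the command action plugin, sets the connection vars, creates a remote temp dir, uploads AnsiballZ_command.py over SFTP, and executes it with the remote Python. The command line itself never appears in this excerpt, so the sketch below only illustrates the shape such a task takes; the actual command and register name used by the role may differ:

    - name: Gather current interface info
      command: ls -1 /sys/class/net   # hypothetical command, not visible in this excerpt
      register: current_interfaces    # hypothetical register name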
24971 1727096441.37782: running TaskExecutor() for managed_node3/TASK: Gather current interface info 24971 1727096441.37787: in run() - task 0afff68d-5257-3482-6844-00000000071b 24971 1727096441.37791: variable 'ansible_search_path' from source: unknown 24971 1727096441.37794: variable 'ansible_search_path' from source: unknown 24971 1727096441.37821: calling self._execute() 24971 1727096441.37929: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096441.37933: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096441.37946: variable 'omit' from source: magic vars 24971 1727096441.38349: variable 'ansible_distribution_major_version' from source: facts 24971 1727096441.38361: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096441.38369: variable 'omit' from source: magic vars 24971 1727096441.38433: variable 'omit' from source: magic vars 24971 1727096441.38466: variable 'omit' from source: magic vars 24971 1727096441.38510: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24971 1727096441.38554: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24971 1727096441.38576: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24971 1727096441.38672: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096441.38676: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096441.38678: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24971 1727096441.38681: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096441.38683: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096441.38756: Set connection var ansible_shell_type to sh 24971 1727096441.38764: Set connection var ansible_shell_executable to /bin/sh 24971 1727096441.38778: Set connection var ansible_timeout to 10 24971 1727096441.38783: Set connection var ansible_connection to ssh 24971 1727096441.38789: Set connection var ansible_pipelining to False 24971 1727096441.38794: Set connection var ansible_module_compression to ZIP_DEFLATED 24971 1727096441.38817: variable 'ansible_shell_executable' from source: unknown 24971 1727096441.38820: variable 'ansible_connection' from source: unknown 24971 1727096441.38823: variable 'ansible_module_compression' from source: unknown 24971 1727096441.38826: variable 'ansible_shell_type' from source: unknown 24971 1727096441.38828: variable 'ansible_shell_executable' from source: unknown 24971 1727096441.38830: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096441.38832: variable 'ansible_pipelining' from source: unknown 24971 1727096441.38845: variable 'ansible_timeout' from source: unknown 24971 1727096441.38848: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096441.39173: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24971 1727096441.39178: variable 'omit' from source: magic vars 24971 
1727096441.39180: starting attempt loop 24971 1727096441.39182: running the handler 24971 1727096441.39185: _low_level_execute_command(): starting 24971 1727096441.39186: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24971 1727096441.39837: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096441.39924: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096441.39995: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096441.41706: stdout chunk (state=3): >>>/root <<< 24971 1727096441.41837: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096441.41851: stdout chunk (state=3): >>><<< 24971 1727096441.41865: stderr chunk (state=3): >>><<< 24971 1727096441.41904: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096441.41926: _low_level_execute_command(): starting 24971 1727096441.41945: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096441.419124-26220-210076231593288 `" && echo ansible-tmp-1727096441.419124-26220-210076231593288="` echo /root/.ansible/tmp/ansible-tmp-1727096441.419124-26220-210076231593288 `" ) && sleep 0' 24971 1727096441.42602: stderr chunk (state=2): 
>>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24971 1727096441.42681: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096441.42735: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096441.42756: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096441.42769: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096441.42833: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096441.44718: stdout chunk (state=3): >>>ansible-tmp-1727096441.419124-26220-210076231593288=/root/.ansible/tmp/ansible-tmp-1727096441.419124-26220-210076231593288 <<< 24971 1727096441.44862: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096441.44895: stdout chunk (state=3): >>><<< 24971 1727096441.44899: stderr chunk (state=3): >>><<< 24971 1727096441.44918: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096441.419124-26220-210076231593288=/root/.ansible/tmp/ansible-tmp-1727096441.419124-26220-210076231593288 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096441.44973: variable 'ansible_module_compression' from source: unknown 24971 1727096441.45025: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24971ctsu8jan/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 24971 1727096441.45066: variable 'ansible_facts' from source: unknown 24971 1727096441.45174: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1727096441.419124-26220-210076231593288/AnsiballZ_command.py 24971 1727096441.45428: Sending initial data 24971 1727096441.45432: Sent initial data (155 bytes) 24971 1727096441.46092: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096441.46118: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096441.46133: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096441.46153: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096441.46225: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096441.47799: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24971 1727096441.47863: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 24971 1727096441.47914: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24971ctsu8jan/tmpeoifr_j9 /root/.ansible/tmp/ansible-tmp-1727096441.419124-26220-210076231593288/AnsiballZ_command.py <<< 24971 1727096441.47917: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096441.419124-26220-210076231593288/AnsiballZ_command.py" <<< 24971 1727096441.47964: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24971ctsu8jan/tmpeoifr_j9" to remote "/root/.ansible/tmp/ansible-tmp-1727096441.419124-26220-210076231593288/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096441.419124-26220-210076231593288/AnsiballZ_command.py" <<< 24971 1727096441.48776: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096441.48826: stderr chunk (state=3): >>><<< 24971 1727096441.48829: stdout chunk (state=3): >>><<< 24971 1727096441.48956: done transferring module to remote 24971 1727096441.48959: _low_level_execute_command(): starting 24971 1727096441.48962: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096441.419124-26220-210076231593288/ /root/.ansible/tmp/ansible-tmp-1727096441.419124-26220-210076231593288/AnsiballZ_command.py && sleep 0' 24971 1727096441.49735: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24971 1727096441.49750: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096441.49765: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096441.49794: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24971 1727096441.49836: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096441.49957: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096441.50000: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096441.50034: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096441.50062: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096441.50135: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096441.51975: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096441.51979: stdout chunk (state=3): >>><<< 24971 1727096441.51981: stderr chunk (state=3): >>><<< 24971 1727096441.52087: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096441.52096: _low_level_execute_command(): starting 24971 1727096441.52099: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096441.419124-26220-210076231593288/AnsiballZ_command.py && sleep 0' 24971 1727096441.52611: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24971 1727096441.52615: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096441.52618: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096441.52620: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24971 1727096441.52622: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 24971 1727096441.52625: stderr chunk (state=3): >>>debug2: match not found <<< 24971 1727096441.52631: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096441.52681: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24971 1727096441.52684: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address <<< 24971 1727096441.52687: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24971 1727096441.52689: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096441.52691: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096441.52700: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24971 1727096441.52702: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 24971 1727096441.52719: stderr chunk (state=3): >>>debug2: match found <<< 24971 1727096441.52721: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096441.52786: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096441.52800: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096441.52840: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096441.52913: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096441.68808: stdout chunk (state=3): >>> 
{"changed": true, "stdout": "bonding_masters\neth0\nlo\nveth0", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-23 09:00:41.681013", "end": "2024-09-23 09:00:41.684386", "delta": "0:00:00.003373", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 24971 1727096441.70690: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. <<< 24971 1727096441.70694: stdout chunk (state=3): >>><<< 24971 1727096441.70697: stderr chunk (state=3): >>><<< 24971 1727096441.70699: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo\nveth0", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-23 09:00:41.681013", "end": "2024-09-23 09:00:41.684386", "delta": "0:00:00.003373", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
24971 1727096441.70701: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096441.419124-26220-210076231593288/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24971 1727096441.70703: _low_level_execute_command(): starting 24971 1727096441.70794: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096441.419124-26220-210076231593288/ > /dev/null 2>&1 && sleep 0' 24971 1727096441.71941: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24971 1727096441.72181: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096441.72287: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096441.72356: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096441.74217: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096441.74287: stderr chunk (state=3): >>><<< 24971 1727096441.74673: stdout chunk (state=3): >>><<< 24971 1727096441.74679: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096441.74682: handler run complete 24971 1727096441.74685: Evaluated conditional (False): False 24971 1727096441.74687: attempt loop complete, returning result 24971 1727096441.74689: _execute() done 24971 1727096441.74691: dumping result to json 24971 1727096441.74693: done dumping result, returning 24971 1727096441.74695: done running TaskExecutor() for managed_node3/TASK: Gather current interface info [0afff68d-5257-3482-6844-00000000071b] 24971 1727096441.74697: sending task result for task 0afff68d-5257-3482-6844-00000000071b 24971 1727096441.74770: done sending task result for task 0afff68d-5257-3482-6844-00000000071b 24971 1727096441.74773: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003373", "end": "2024-09-23 09:00:41.684386", "rc": 0, "start": "2024-09-23 09:00:41.681013" } STDOUT: bonding_masters eth0 lo veth0 24971 1727096441.74853: no more pending results, returning what we have 24971 1727096441.74857: results queue empty 24971 1727096441.74859: checking for any_errors_fatal 24971 1727096441.74861: done checking for any_errors_fatal 24971 1727096441.74861: checking for max_fail_percentage 24971 1727096441.74863: done checking for max_fail_percentage 24971 1727096441.74864: checking to see if all hosts have failed and the running result is not ok 24971 1727096441.74865: done checking to see if all hosts have failed 24971 1727096441.74866: getting the remaining hosts for this loop 24971 1727096441.74869: done getting the remaining hosts for this loop 24971 1727096441.74873: getting the next task for host managed_node3 24971 1727096441.74882: done getting next task for host managed_node3 24971 1727096441.74884: ^ task is: TASK: Set current_interfaces 24971 1727096441.74889: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 24971 1727096441.74895: getting variables 24971 1727096441.74896: in VariableManager get_vars() 24971 1727096441.74940: Calling all_inventory to load vars for managed_node3 24971 1727096441.74943: Calling groups_inventory to load vars for managed_node3 24971 1727096441.74946: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096441.74958: Calling all_plugins_play to load vars for managed_node3 24971 1727096441.74962: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096441.74965: Calling groups_plugins_play to load vars for managed_node3 24971 1727096441.78107: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096441.81642: done with get_vars() 24971 1727096441.81669: done getting variables 24971 1727096441.81733: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Monday 23 September 2024 09:00:41 -0400 (0:00:00.449) 0:00:29.296 ****** 24971 1727096441.81884: entering _queue_task() for managed_node3/set_fact 24971 1727096441.82578: worker is 1 (out of 1 available) 24971 1727096441.82590: exiting _queue_task() for managed_node3/set_fact 24971 1727096441.82603: done queuing things up, now waiting for results queue to drain 24971 1727096441.82603: waiting for pending results... 
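The run() trace that follows resolves the '_current_interfaces' variable and produces the current_interfaces fact shown in the ok: result further down. A plausible sketch of the task at get_current_interfaces.yml:9; only the task name, file path and resulting fact come from the log, and the Jinja2 expression is an assumption:

    # Hypothetical reconstruction; the exact expression is assumed
    - name: Set current_interfaces
      set_fact:
        current_interfaces: "{{ _current_interfaces.stdout_lines }}"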
24971 1727096441.83316: running TaskExecutor() for managed_node3/TASK: Set current_interfaces 24971 1727096441.83436: in run() - task 0afff68d-5257-3482-6844-00000000071c 24971 1727096441.83740: variable 'ansible_search_path' from source: unknown 24971 1727096441.83744: variable 'ansible_search_path' from source: unknown 24971 1727096441.83748: calling self._execute() 24971 1727096441.83804: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096441.83816: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096441.83831: variable 'omit' from source: magic vars 24971 1727096441.84642: variable 'ansible_distribution_major_version' from source: facts 24971 1727096441.84665: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096441.84702: variable 'omit' from source: magic vars 24971 1727096441.84833: variable 'omit' from source: magic vars 24971 1727096441.85238: variable '_current_interfaces' from source: set_fact 24971 1727096441.85243: variable 'omit' from source: magic vars 24971 1727096441.85384: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24971 1727096441.85427: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24971 1727096441.85590: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24971 1727096441.85599: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096441.85617: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096441.85653: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24971 1727096441.85681: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096441.85704: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096441.85911: Set connection var ansible_shell_type to sh 24971 1727096441.86072: Set connection var ansible_shell_executable to /bin/sh 24971 1727096441.86075: Set connection var ansible_timeout to 10 24971 1727096441.86078: Set connection var ansible_connection to ssh 24971 1727096441.86080: Set connection var ansible_pipelining to False 24971 1727096441.86082: Set connection var ansible_module_compression to ZIP_DEFLATED 24971 1727096441.86084: variable 'ansible_shell_executable' from source: unknown 24971 1727096441.86086: variable 'ansible_connection' from source: unknown 24971 1727096441.86113: variable 'ansible_module_compression' from source: unknown 24971 1727096441.86121: variable 'ansible_shell_type' from source: unknown 24971 1727096441.86131: variable 'ansible_shell_executable' from source: unknown 24971 1727096441.86328: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096441.86331: variable 'ansible_pipelining' from source: unknown 24971 1727096441.86333: variable 'ansible_timeout' from source: unknown 24971 1727096441.86335: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096441.86499: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, 
class_only=False) 24971 1727096441.86518: variable 'omit' from source: magic vars 24971 1727096441.86554: starting attempt loop 24971 1727096441.86562: running the handler 24971 1727096441.86588: handler run complete 24971 1727096441.86769: attempt loop complete, returning result 24971 1727096441.86772: _execute() done 24971 1727096441.86775: dumping result to json 24971 1727096441.86777: done dumping result, returning 24971 1727096441.86780: done running TaskExecutor() for managed_node3/TASK: Set current_interfaces [0afff68d-5257-3482-6844-00000000071c] 24971 1727096441.86782: sending task result for task 0afff68d-5257-3482-6844-00000000071c 24971 1727096441.86855: done sending task result for task 0afff68d-5257-3482-6844-00000000071c 24971 1727096441.86858: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo", "veth0" ] }, "changed": false } 24971 1727096441.86925: no more pending results, returning what we have 24971 1727096441.86929: results queue empty 24971 1727096441.86930: checking for any_errors_fatal 24971 1727096441.86940: done checking for any_errors_fatal 24971 1727096441.86941: checking for max_fail_percentage 24971 1727096441.86943: done checking for max_fail_percentage 24971 1727096441.86944: checking to see if all hosts have failed and the running result is not ok 24971 1727096441.86945: done checking to see if all hosts have failed 24971 1727096441.86946: getting the remaining hosts for this loop 24971 1727096441.86947: done getting the remaining hosts for this loop 24971 1727096441.86951: getting the next task for host managed_node3 24971 1727096441.86959: done getting next task for host managed_node3 24971 1727096441.86962: ^ task is: TASK: Show current_interfaces 24971 1727096441.86967: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 24971 1727096441.86972: getting variables 24971 1727096441.86974: in VariableManager get_vars() 24971 1727096441.87014: Calling all_inventory to load vars for managed_node3 24971 1727096441.87017: Calling groups_inventory to load vars for managed_node3 24971 1727096441.87019: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096441.87031: Calling all_plugins_play to load vars for managed_node3 24971 1727096441.87034: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096441.87037: Calling groups_plugins_play to load vars for managed_node3 24971 1727096441.90066: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096441.91213: done with get_vars() 24971 1727096441.91236: done getting variables 24971 1727096441.91283: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Monday 23 September 2024 09:00:41 -0400 (0:00:00.094) 0:00:29.390 ****** 24971 1727096441.91309: entering _queue_task() for managed_node3/debug 24971 1727096441.91566: worker is 1 (out of 1 available) 24971 1727096441.91580: exiting _queue_task() for managed_node3/debug 24971 1727096441.91592: done queuing things up, now waiting for results queue to drain 24971 1727096441.91593: waiting for pending results... 24971 1727096441.91787: running TaskExecutor() for managed_node3/TASK: Show current_interfaces 24971 1727096441.91871: in run() - task 0afff68d-5257-3482-6844-0000000006e5 24971 1727096441.91884: variable 'ansible_search_path' from source: unknown 24971 1727096441.91888: variable 'ansible_search_path' from source: unknown 24971 1727096441.91918: calling self._execute() 24971 1727096441.92003: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096441.92006: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096441.92009: variable 'omit' from source: magic vars 24971 1727096441.92343: variable 'ansible_distribution_major_version' from source: facts 24971 1727096441.92364: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096441.92368: variable 'omit' from source: magic vars 24971 1727096441.92415: variable 'omit' from source: magic vars 24971 1727096441.92533: variable 'current_interfaces' from source: set_fact 24971 1727096441.92545: variable 'omit' from source: magic vars 24971 1727096441.92676: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24971 1727096441.92680: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24971 1727096441.92683: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24971 1727096441.92702: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096441.92723: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096441.92833: 
variable 'inventory_hostname' from source: host vars for 'managed_node3' 24971 1727096441.92836: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096441.92839: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096441.92859: Set connection var ansible_shell_type to sh 24971 1727096441.92871: Set connection var ansible_shell_executable to /bin/sh 24971 1727096441.92883: Set connection var ansible_timeout to 10 24971 1727096441.92888: Set connection var ansible_connection to ssh 24971 1727096441.92894: Set connection var ansible_pipelining to False 24971 1727096441.92899: Set connection var ansible_module_compression to ZIP_DEFLATED 24971 1727096441.92923: variable 'ansible_shell_executable' from source: unknown 24971 1727096441.92927: variable 'ansible_connection' from source: unknown 24971 1727096441.92929: variable 'ansible_module_compression' from source: unknown 24971 1727096441.92970: variable 'ansible_shell_type' from source: unknown 24971 1727096441.92975: variable 'ansible_shell_executable' from source: unknown 24971 1727096441.92978: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096441.92981: variable 'ansible_pipelining' from source: unknown 24971 1727096441.92983: variable 'ansible_timeout' from source: unknown 24971 1727096441.92985: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096441.93278: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24971 1727096441.93282: variable 'omit' from source: magic vars 24971 1727096441.93284: starting attempt loop 24971 1727096441.93285: running the handler 24971 1727096441.93287: handler run complete 24971 1727096441.93288: attempt loop complete, returning result 24971 1727096441.93290: _execute() done 24971 1727096441.93291: dumping result to json 24971 1727096441.93293: done dumping result, returning 24971 1727096441.93295: done running TaskExecutor() for managed_node3/TASK: Show current_interfaces [0afff68d-5257-3482-6844-0000000006e5] 24971 1727096441.93296: sending task result for task 0afff68d-5257-3482-6844-0000000006e5 24971 1727096441.93353: done sending task result for task 0afff68d-5257-3482-6844-0000000006e5 24971 1727096441.93355: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo', 'veth0'] 24971 1727096441.93424: no more pending results, returning what we have 24971 1727096441.93427: results queue empty 24971 1727096441.93428: checking for any_errors_fatal 24971 1727096441.93433: done checking for any_errors_fatal 24971 1727096441.93433: checking for max_fail_percentage 24971 1727096441.93435: done checking for max_fail_percentage 24971 1727096441.93436: checking to see if all hosts have failed and the running result is not ok 24971 1727096441.93437: done checking to see if all hosts have failed 24971 1727096441.93437: getting the remaining hosts for this loop 24971 1727096441.93438: done getting the remaining hosts for this loop 24971 1727096441.93442: getting the next task for host managed_node3 24971 1727096441.93450: done getting next task for host managed_node3 24971 1727096441.93452: ^ task is: TASK: Install iproute 24971 1727096441.93455: ^ state is: HOST STATE: block=3, task=15, 
rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 24971 1727096441.93458: getting variables 24971 1727096441.93459: in VariableManager get_vars() 24971 1727096441.93585: Calling all_inventory to load vars for managed_node3 24971 1727096441.93592: Calling groups_inventory to load vars for managed_node3 24971 1727096441.93595: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096441.93604: Calling all_plugins_play to load vars for managed_node3 24971 1727096441.93607: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096441.93610: Calling groups_plugins_play to load vars for managed_node3 24971 1727096441.94833: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096441.95683: done with get_vars() 24971 1727096441.95701: done getting variables 24971 1727096441.95744: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Install iproute] ********************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 Monday 23 September 2024 09:00:41 -0400 (0:00:00.044) 0:00:29.435 ****** 24971 1727096441.95771: entering _queue_task() for managed_node3/package 24971 1727096441.96029: worker is 1 (out of 1 available) 24971 1727096441.96040: exiting _queue_task() for managed_node3/package 24971 1727096441.96053: done queuing things up, now waiting for results queue to drain 24971 1727096441.96054: waiting for pending results... 
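The records below dispatch the "Install iproute" task from manage_test_interface.yml:16 through the generic package action plugin, which resolves to the dnf module on this host; the module_args later in the trace show name ["iproute"] and state "present". A minimal sketch consistent with that invocation, with only the task name, package name and state taken from the log:

    # Hypothetical reconstruction of the task behind the dnf call below
    - name: Install iproute
      package:
        name: iproute
        state: present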
24971 1727096441.96230: running TaskExecutor() for managed_node3/TASK: Install iproute 24971 1727096441.96303: in run() - task 0afff68d-5257-3482-6844-0000000005cf 24971 1727096441.96314: variable 'ansible_search_path' from source: unknown 24971 1727096441.96317: variable 'ansible_search_path' from source: unknown 24971 1727096441.96351: calling self._execute() 24971 1727096441.96575: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096441.96578: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096441.96580: variable 'omit' from source: magic vars 24971 1727096441.96852: variable 'ansible_distribution_major_version' from source: facts 24971 1727096441.96875: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096441.96888: variable 'omit' from source: magic vars 24971 1727096441.96932: variable 'omit' from source: magic vars 24971 1727096441.97125: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24971 1727096441.99334: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24971 1727096441.99408: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24971 1727096441.99452: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24971 1727096441.99498: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24971 1727096441.99531: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24971 1727096441.99638: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24971 1727096441.99692: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24971 1727096441.99722: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24971 1727096441.99766: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24971 1727096441.99790: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24971 1727096441.99900: variable '__network_is_ostree' from source: set_fact 24971 1727096441.99910: variable 'omit' from source: magic vars 24971 1727096441.99946: variable 'omit' from source: magic vars 24971 1727096441.99985: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24971 1727096442.00375: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24971 1727096442.00378: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24971 1727096442.00381: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 24971 1727096442.00384: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096442.00386: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24971 1727096442.00388: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096442.00391: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096442.00791: Set connection var ansible_shell_type to sh 24971 1727096442.00794: Set connection var ansible_shell_executable to /bin/sh 24971 1727096442.00797: Set connection var ansible_timeout to 10 24971 1727096442.00799: Set connection var ansible_connection to ssh 24971 1727096442.00801: Set connection var ansible_pipelining to False 24971 1727096442.00802: Set connection var ansible_module_compression to ZIP_DEFLATED 24971 1727096442.00805: variable 'ansible_shell_executable' from source: unknown 24971 1727096442.00806: variable 'ansible_connection' from source: unknown 24971 1727096442.00808: variable 'ansible_module_compression' from source: unknown 24971 1727096442.00810: variable 'ansible_shell_type' from source: unknown 24971 1727096442.00812: variable 'ansible_shell_executable' from source: unknown 24971 1727096442.00813: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096442.00815: variable 'ansible_pipelining' from source: unknown 24971 1727096442.00817: variable 'ansible_timeout' from source: unknown 24971 1727096442.00818: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096442.00954: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24971 1727096442.01089: variable 'omit' from source: magic vars 24971 1727096442.01100: starting attempt loop 24971 1727096442.01107: running the handler 24971 1727096442.01120: variable 'ansible_facts' from source: unknown 24971 1727096442.01127: variable 'ansible_facts' from source: unknown 24971 1727096442.01166: _low_level_execute_command(): starting 24971 1727096442.01506: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24971 1727096442.02253: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24971 1727096442.02276: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096442.02292: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096442.02311: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24971 1727096442.02330: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 24971 1727096442.02341: stderr chunk (state=3): >>>debug2: match not found <<< 24971 1727096442.02354: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096442.02376: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24971 1727096442.02387: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address <<< 24971 1727096442.02396: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24971 
1727096442.02405: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096442.02415: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096442.02427: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24971 1727096442.02504: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096442.02690: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096442.02888: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096442.04715: stdout chunk (state=3): >>>/root <<< 24971 1727096442.04795: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096442.04799: stdout chunk (state=3): >>><<< 24971 1727096442.04801: stderr chunk (state=3): >>><<< 24971 1727096442.04819: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096442.04837: _low_level_execute_command(): starting 24971 1727096442.04846: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096442.048263-26251-153065260668464 `" && echo ansible-tmp-1727096442.048263-26251-153065260668464="` echo /root/.ansible/tmp/ansible-tmp-1727096442.048263-26251-153065260668464 `" ) && sleep 0' 24971 1727096442.06088: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096442.08008: stdout chunk (state=3): >>>ansible-tmp-1727096442.048263-26251-153065260668464=/root/.ansible/tmp/ansible-tmp-1727096442.048263-26251-153065260668464 <<< 24971 1727096442.08376: stdout chunk (state=3): >>><<< 24971 1727096442.08381: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096442.08384: stderr chunk (state=3): >>><<< 24971 1727096442.08387: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096442.048263-26251-153065260668464=/root/.ansible/tmp/ansible-tmp-1727096442.048263-26251-153065260668464 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096442.08389: variable 'ansible_module_compression' from source: unknown 24971 1727096442.08391: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24971ctsu8jan/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 24971 1727096442.08394: variable 'ansible_facts' from source: unknown 24971 1727096442.08686: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096442.048263-26251-153065260668464/AnsiballZ_dnf.py 24971 1727096442.09022: Sending initial data 24971 1727096442.09026: Sent initial data (151 bytes) 24971 1727096442.10074: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24971 1727096442.10363: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096442.10401: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096442.10418: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096442.10483: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096442.12215: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24971 1727096442.12328: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 24971 1727096442.12375: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24971ctsu8jan/tmphox9rubh /root/.ansible/tmp/ansible-tmp-1727096442.048263-26251-153065260668464/AnsiballZ_dnf.py <<< 24971 1727096442.12385: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096442.048263-26251-153065260668464/AnsiballZ_dnf.py" <<< 24971 1727096442.12410: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24971ctsu8jan/tmphox9rubh" to remote "/root/.ansible/tmp/ansible-tmp-1727096442.048263-26251-153065260668464/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096442.048263-26251-153065260668464/AnsiballZ_dnf.py" <<< 24971 1727096442.14991: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096442.15377: stderr chunk (state=3): >>><<< 24971 1727096442.15380: stdout chunk (state=3): >>><<< 24971 1727096442.15383: done transferring module to remote 24971 1727096442.15385: _low_level_execute_command(): starting 24971 1727096442.15387: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096442.048263-26251-153065260668464/ /root/.ansible/tmp/ansible-tmp-1727096442.048263-26251-153065260668464/AnsiballZ_dnf.py && sleep 0' 24971 1727096442.16731: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24971 1727096442.16998: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096442.17057: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096442.17264: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096442.17475: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096442.19162: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096442.19224: stderr chunk (state=3): >>><<< 24971 1727096442.19234: stdout chunk (state=3): >>><<< 24971 1727096442.19258: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096442.19307: _low_level_execute_command(): starting 24971 1727096442.19322: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096442.048263-26251-153065260668464/AnsiballZ_dnf.py && sleep 0' 24971 1727096442.19929: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24971 1727096442.19945: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096442.19962: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096442.19985: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24971 1727096442.20004: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 24971 1727096442.20016: stderr chunk (state=3): >>>debug2: match not found <<< 24971 1727096442.20030: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096442.20091: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096442.20140: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096442.20157: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096442.20187: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096442.20265: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096442.62783: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 24971 1727096442.67060: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. <<< 24971 1727096442.67538: stdout chunk (state=3): >>><<< 24971 1727096442.67541: stderr chunk (state=3): >>><<< 24971 1727096442.67545: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 24971 1727096442.67553: done with _execute_module (ansible.legacy.dnf, {'name': 'iproute', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096442.048263-26251-153065260668464/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24971 1727096442.67556: _low_level_execute_command(): starting 24971 1727096442.67558: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096442.048263-26251-153065260668464/ > /dev/null 2>&1 && sleep 0' 24971 1727096442.68302: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24971 1727096442.68314: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096442.68327: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096442.68344: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24971 1727096442.68359: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 24971 1727096442.68375: stderr chunk (state=3): >>>debug2: match not found <<< 24971 1727096442.68391: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096442.68411: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24971 1727096442.68483: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096442.68507: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096442.68530: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096442.68545: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096442.68608: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096442.70425: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096442.70676: stderr chunk (state=3): >>><<< 24971 1727096442.70679: stdout chunk (state=3): >>><<< 24971 1727096442.70682: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096442.70684: handler run complete 24971 1727096442.71087: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24971 1727096442.71372: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24971 1727096442.71453: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24971 1727096442.71553: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24971 1727096442.71662: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24971 1727096442.71850: variable '__install_status' from source: set_fact 24971 1727096442.71878: Evaluated conditional (__install_status is success): True 24971 1727096442.71899: attempt loop complete, returning result 24971 1727096442.71906: _execute() done 24971 1727096442.71911: dumping result to json 24971 1727096442.71922: done dumping result, returning 24971 1727096442.71933: done running TaskExecutor() for managed_node3/TASK: Install iproute [0afff68d-5257-3482-6844-0000000005cf] 24971 1727096442.71941: sending task result for task 0afff68d-5257-3482-6844-0000000005cf ok: [managed_node3] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 24971 1727096442.72173: no more pending results, returning what we have 24971 1727096442.72178: results queue empty 24971 1727096442.72179: checking for any_errors_fatal 24971 1727096442.72185: done checking for any_errors_fatal 24971 1727096442.72186: checking for max_fail_percentage 24971 1727096442.72188: done checking for max_fail_percentage 24971 1727096442.72189: checking to see if all hosts have failed and the running result is not ok 24971 1727096442.72190: done checking to see if all hosts have failed 24971 1727096442.72191: getting the remaining hosts for this loop 24971 1727096442.72193: done getting the remaining hosts for this loop 24971 1727096442.72197: getting the next task for host managed_node3 24971 1727096442.72203: done getting next task for host managed_node3 24971 1727096442.72205: ^ task is: TASK: Create veth interface {{ interface }} 24971 1727096442.72208: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 24971 1727096442.72212: getting variables 24971 1727096442.72213: in VariableManager get_vars() 24971 1727096442.72253: Calling all_inventory to load vars for managed_node3 24971 1727096442.72255: Calling groups_inventory to load vars for managed_node3 24971 1727096442.72257: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096442.72270: Calling all_plugins_play to load vars for managed_node3 24971 1727096442.72273: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096442.72276: Calling groups_plugins_play to load vars for managed_node3 24971 1727096442.73509: done sending task result for task 0afff68d-5257-3482-6844-0000000005cf 24971 1727096442.73513: WORKER PROCESS EXITING 24971 1727096442.74815: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096442.77943: done with get_vars() 24971 1727096442.77984: done getting variables 24971 1727096442.78048: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 24971 1727096442.78375: variable 'interface' from source: play vars TASK [Create veth interface veth0] ********************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 Monday 23 September 2024 09:00:42 -0400 (0:00:00.826) 0:00:30.261 ****** 24971 1727096442.78409: entering _queue_task() for managed_node3/command 24971 1727096442.79165: worker is 1 (out of 1 available) 24971 1727096442.79376: exiting _queue_task() for managed_node3/command 24971 1727096442.79387: done queuing things up, now waiting for results queue to drain 24971 1727096442.79388: waiting for pending results... 
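The trace above shows the "Install iproute" task finishing: the ansible.legacy.dnf module ran with name=['iproute'] and state=present, returned rc=0 with "Nothing to do", and the task then evaluated '__install_status is success' before reporting ok with attempts: 1. A minimal sketch of how such a task is commonly written follows; only the module arguments and the success conditional are confirmed by the log, while the register/until/retries wiring and the retry counts are assumptions:

    - name: Install iproute
      ansible.builtin.dnf:          # logged as ansible.legacy.dnf with name=['iproute'], state=present
        name: iproute
        state: present
      register: __install_status    # assumed; the log only shows '__install_status is success' being evaluated
      until: __install_status is success
      retries: 3                    # hypothetical retry count
      delay: 5                      # hypothetical delay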
24971 1727096442.79716: running TaskExecutor() for managed_node3/TASK: Create veth interface veth0 24971 1727096442.79918: in run() - task 0afff68d-5257-3482-6844-0000000005d0 24971 1727096442.79937: variable 'ansible_search_path' from source: unknown 24971 1727096442.80076: variable 'ansible_search_path' from source: unknown 24971 1727096442.80445: variable 'interface' from source: play vars 24971 1727096442.80775: variable 'interface' from source: play vars 24971 1727096442.80827: variable 'interface' from source: play vars 24971 1727096442.81176: Loaded config def from plugin (lookup/items) 24971 1727096442.81190: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py 24971 1727096442.81437: variable 'omit' from source: magic vars 24971 1727096442.81588: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096442.81604: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096442.81621: variable 'omit' from source: magic vars 24971 1727096442.82075: variable 'ansible_distribution_major_version' from source: facts 24971 1727096442.82377: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096442.82540: variable 'type' from source: play vars 24971 1727096442.82590: variable 'state' from source: include params 24971 1727096442.82601: variable 'interface' from source: play vars 24971 1727096442.82607: variable 'current_interfaces' from source: set_fact 24971 1727096442.82617: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): False 24971 1727096442.82622: when evaluation is False, skipping this task 24971 1727096442.82651: variable 'item' from source: unknown 24971 1727096442.82775: variable 'item' from source: unknown skipping: [managed_node3] => (item=ip link add veth0 type veth peer name peerveth0) => { "ansible_loop_var": "item", "changed": false, "false_condition": "type == 'veth' and state == 'present' and interface not in current_interfaces", "item": "ip link add veth0 type veth peer name peerveth0", "skip_reason": "Conditional result was False" } 24971 1727096442.83142: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096442.83145: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096442.83149: variable 'omit' from source: magic vars 24971 1727096442.83677: variable 'ansible_distribution_major_version' from source: facts 24971 1727096442.84175: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096442.84674: variable 'type' from source: play vars 24971 1727096442.84677: variable 'state' from source: include params 24971 1727096442.84680: variable 'interface' from source: play vars 24971 1727096442.84682: variable 'current_interfaces' from source: set_fact 24971 1727096442.84685: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): False 24971 1727096442.84687: when evaluation is False, skipping this task 24971 1727096442.84689: variable 'item' from source: unknown 24971 1727096442.84691: variable 'item' from source: unknown skipping: [managed_node3] => (item=ip link set peerveth0 up) => { "ansible_loop_var": "item", "changed": false, "false_condition": "type == 'veth' and state == 'present' and interface not in current_interfaces", "item": "ip link set peerveth0 up", "skip_reason": "Conditional result was False" } 24971 1727096442.85175: variable 
'ansible_host' from source: host vars for 'managed_node3' 24971 1727096442.85178: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096442.85181: variable 'omit' from source: magic vars 24971 1727096442.85183: variable 'ansible_distribution_major_version' from source: facts 24971 1727096442.85185: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096442.85542: variable 'type' from source: play vars 24971 1727096442.85975: variable 'state' from source: include params 24971 1727096442.85979: variable 'interface' from source: play vars 24971 1727096442.85981: variable 'current_interfaces' from source: set_fact 24971 1727096442.85984: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): False 24971 1727096442.85987: when evaluation is False, skipping this task 24971 1727096442.85989: variable 'item' from source: unknown 24971 1727096442.85991: variable 'item' from source: unknown skipping: [managed_node3] => (item=ip link set veth0 up) => { "ansible_loop_var": "item", "changed": false, "false_condition": "type == 'veth' and state == 'present' and interface not in current_interfaces", "item": "ip link set veth0 up", "skip_reason": "Conditional result was False" } 24971 1727096442.86050: dumping result to json 24971 1727096442.86053: done dumping result, returning 24971 1727096442.86055: done running TaskExecutor() for managed_node3/TASK: Create veth interface veth0 [0afff68d-5257-3482-6844-0000000005d0] 24971 1727096442.86062: sending task result for task 0afff68d-5257-3482-6844-0000000005d0 24971 1727096442.86106: done sending task result for task 0afff68d-5257-3482-6844-0000000005d0 24971 1727096442.86110: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false } MSG: All items skipped 24971 1727096442.86207: no more pending results, returning what we have 24971 1727096442.86211: results queue empty 24971 1727096442.86212: checking for any_errors_fatal 24971 1727096442.86220: done checking for any_errors_fatal 24971 1727096442.86221: checking for max_fail_percentage 24971 1727096442.86223: done checking for max_fail_percentage 24971 1727096442.86223: checking to see if all hosts have failed and the running result is not ok 24971 1727096442.86224: done checking to see if all hosts have failed 24971 1727096442.86225: getting the remaining hosts for this loop 24971 1727096442.86226: done getting the remaining hosts for this loop 24971 1727096442.86230: getting the next task for host managed_node3 24971 1727096442.86236: done getting next task for host managed_node3 24971 1727096442.86239: ^ task is: TASK: Set up veth as managed by NetworkManager 24971 1727096442.86242: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 24971 1727096442.86247: getting variables 24971 1727096442.86249: in VariableManager get_vars() 24971 1727096442.86293: Calling all_inventory to load vars for managed_node3 24971 1727096442.86296: Calling groups_inventory to load vars for managed_node3 24971 1727096442.86298: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096442.86310: Calling all_plugins_play to load vars for managed_node3 24971 1727096442.86313: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096442.86315: Calling groups_plugins_play to load vars for managed_node3 24971 1727096442.89265: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096442.92281: done with get_vars() 24971 1727096442.92310: done getting variables 24971 1727096442.92572: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set up veth as managed by NetworkManager] ******************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:35 Monday 23 September 2024 09:00:42 -0400 (0:00:00.141) 0:00:30.403 ****** 24971 1727096442.92610: entering _queue_task() for managed_node3/command 24971 1727096442.93084: worker is 1 (out of 1 available) 24971 1727096442.93099: exiting _queue_task() for managed_node3/command 24971 1727096442.93113: done queuing things up, now waiting for results queue to drain 24971 1727096442.93114: waiting for pending results... 
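All three items of the "Create veth interface veth0" task were skipped because the conditional type == 'veth' and state == 'present' and interface not in current_interfaces evaluated to False for each item. The item strings, the loop variable, and the when expression in the skip records imply a task shaped roughly like the sketch below; the exact YAML layout (with_items vs. loop, quoting) is an assumption:

    - name: Create veth interface {{ interface }}
      command: "{{ item }}"
      with_items:
        - ip link add {{ interface }} type veth peer name peer{{ interface }}
        - ip link set peer{{ interface }} up
        - ip link set {{ interface }} up
      when: type == 'veth' and state == 'present' and interface not in current_interfaces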
24971 1727096442.93389: running TaskExecutor() for managed_node3/TASK: Set up veth as managed by NetworkManager 24971 1727096442.93475: in run() - task 0afff68d-5257-3482-6844-0000000005d1 24971 1727096442.93492: variable 'ansible_search_path' from source: unknown 24971 1727096442.93496: variable 'ansible_search_path' from source: unknown 24971 1727096442.93530: calling self._execute() 24971 1727096442.93628: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096442.93632: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096442.93642: variable 'omit' from source: magic vars 24971 1727096442.94012: variable 'ansible_distribution_major_version' from source: facts 24971 1727096442.94019: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096442.94205: variable 'type' from source: play vars 24971 1727096442.94209: variable 'state' from source: include params 24971 1727096442.94212: Evaluated conditional (type == 'veth' and state == 'present'): False 24971 1727096442.94215: when evaluation is False, skipping this task 24971 1727096442.94217: _execute() done 24971 1727096442.94220: dumping result to json 24971 1727096442.94229: done dumping result, returning 24971 1727096442.94232: done running TaskExecutor() for managed_node3/TASK: Set up veth as managed by NetworkManager [0afff68d-5257-3482-6844-0000000005d1] 24971 1727096442.94234: sending task result for task 0afff68d-5257-3482-6844-0000000005d1 24971 1727096442.94328: done sending task result for task 0afff68d-5257-3482-6844-0000000005d1 24971 1727096442.94448: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'veth' and state == 'present'", "skip_reason": "Conditional result was False" } 24971 1727096442.94507: no more pending results, returning what we have 24971 1727096442.94511: results queue empty 24971 1727096442.94512: checking for any_errors_fatal 24971 1727096442.94525: done checking for any_errors_fatal 24971 1727096442.94526: checking for max_fail_percentage 24971 1727096442.94528: done checking for max_fail_percentage 24971 1727096442.94529: checking to see if all hosts have failed and the running result is not ok 24971 1727096442.94530: done checking to see if all hosts have failed 24971 1727096442.94531: getting the remaining hosts for this loop 24971 1727096442.94532: done getting the remaining hosts for this loop 24971 1727096442.94536: getting the next task for host managed_node3 24971 1727096442.94543: done getting next task for host managed_node3 24971 1727096442.94546: ^ task is: TASK: Delete veth interface {{ interface }} 24971 1727096442.94549: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 24971 1727096442.94554: getting variables 24971 1727096442.94555: in VariableManager get_vars() 24971 1727096442.94603: Calling all_inventory to load vars for managed_node3 24971 1727096442.94606: Calling groups_inventory to load vars for managed_node3 24971 1727096442.94609: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096442.94624: Calling all_plugins_play to load vars for managed_node3 24971 1727096442.94627: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096442.94631: Calling groups_plugins_play to load vars for managed_node3 24971 1727096442.96173: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096442.97823: done with get_vars() 24971 1727096442.97851: done getting variables 24971 1727096442.97914: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 24971 1727096442.98044: variable 'interface' from source: play vars TASK [Delete veth interface veth0] ********************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:43 Monday 23 September 2024 09:00:42 -0400 (0:00:00.054) 0:00:30.458 ****** 24971 1727096442.98087: entering _queue_task() for managed_node3/command 24971 1727096442.98345: worker is 1 (out of 1 available) 24971 1727096442.98358: exiting _queue_task() for managed_node3/command 24971 1727096442.98375: done queuing things up, now waiting for results queue to drain 24971 1727096442.98377: waiting for pending results... 
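The "Set up veth as managed by NetworkManager" task is likewise skipped because type == 'veth' and state == 'present' evaluates to False, so its command never reaches the remote host in this run. The log only confirms that it is a command task guarded by that conditional; the body below is purely illustrative of how such a step is often done with nmcli, not what the playbook actually runs:

    - name: Set up veth as managed by NetworkManager
      command: nmcli device set {{ interface }} managed yes   # illustrative only; actual command not shown in this excerpt
      when: type == 'veth' and state == 'present'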
24971 1727096442.98543: running TaskExecutor() for managed_node3/TASK: Delete veth interface veth0 24971 1727096442.98613: in run() - task 0afff68d-5257-3482-6844-0000000005d2 24971 1727096442.98625: variable 'ansible_search_path' from source: unknown 24971 1727096442.98628: variable 'ansible_search_path' from source: unknown 24971 1727096442.98657: calling self._execute() 24971 1727096442.98736: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096442.98739: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096442.98749: variable 'omit' from source: magic vars 24971 1727096442.99181: variable 'ansible_distribution_major_version' from source: facts 24971 1727096442.99185: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096442.99289: variable 'type' from source: play vars 24971 1727096442.99293: variable 'state' from source: include params 24971 1727096442.99298: variable 'interface' from source: play vars 24971 1727096442.99302: variable 'current_interfaces' from source: set_fact 24971 1727096442.99310: Evaluated conditional (type == 'veth' and state == 'absent' and interface in current_interfaces): True 24971 1727096442.99317: variable 'omit' from source: magic vars 24971 1727096442.99355: variable 'omit' from source: magic vars 24971 1727096442.99446: variable 'interface' from source: play vars 24971 1727096442.99465: variable 'omit' from source: magic vars 24971 1727096442.99525: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24971 1727096442.99539: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24971 1727096442.99559: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24971 1727096442.99578: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096442.99589: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096442.99650: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24971 1727096442.99654: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096442.99656: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096442.99831: Set connection var ansible_shell_type to sh 24971 1727096442.99835: Set connection var ansible_shell_executable to /bin/sh 24971 1727096442.99837: Set connection var ansible_timeout to 10 24971 1727096442.99840: Set connection var ansible_connection to ssh 24971 1727096442.99842: Set connection var ansible_pipelining to False 24971 1727096442.99844: Set connection var ansible_module_compression to ZIP_DEFLATED 24971 1727096442.99847: variable 'ansible_shell_executable' from source: unknown 24971 1727096442.99849: variable 'ansible_connection' from source: unknown 24971 1727096442.99860: variable 'ansible_module_compression' from source: unknown 24971 1727096442.99871: variable 'ansible_shell_type' from source: unknown 24971 1727096442.99965: variable 'ansible_shell_executable' from source: unknown 24971 1727096442.99974: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096442.99977: variable 'ansible_pipelining' from source: unknown 24971 1727096442.99979: variable 'ansible_timeout' from source: unknown 24971 1727096442.99981: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096443.00057: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24971 1727096443.00084: variable 'omit' from source: magic vars 24971 1727096443.00100: starting attempt loop 24971 1727096443.00106: running the handler 24971 1727096443.00125: _low_level_execute_command(): starting 24971 1727096443.00138: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24971 1727096443.01111: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096443.01115: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096443.01119: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration <<< 24971 1727096443.01194: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096443.01198: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096443.01201: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096443.01204: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096443.01242: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096443.01303: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096443.02989: stdout chunk (state=3): >>>/root <<< 24971 1727096443.03153: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096443.03156: stdout chunk (state=3): >>><<< 24971 1727096443.03163: stderr chunk (state=3): >>><<< 24971 1727096443.03315: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096443.03323: _low_level_execute_command(): starting 24971 1727096443.03326: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096443.0319023-26309-47145958871787 `" && echo ansible-tmp-1727096443.0319023-26309-47145958871787="` echo /root/.ansible/tmp/ansible-tmp-1727096443.0319023-26309-47145958871787 `" ) && sleep 0' 24971 1727096443.04116: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096443.04121: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096443.04147: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096443.04165: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096443.04193: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096443.04266: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096443.06155: stdout chunk (state=3): >>>ansible-tmp-1727096443.0319023-26309-47145958871787=/root/.ansible/tmp/ansible-tmp-1727096443.0319023-26309-47145958871787 <<< 24971 1727096443.06290: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096443.06296: stderr chunk (state=3): >>><<< 24971 1727096443.06301: stdout chunk (state=3): >>><<< 24971 1727096443.06318: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096443.0319023-26309-47145958871787=/root/.ansible/tmp/ansible-tmp-1727096443.0319023-26309-47145958871787 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096443.06343: variable 'ansible_module_compression' from source: unknown 24971 1727096443.06388: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24971ctsu8jan/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 24971 1727096443.06419: variable 'ansible_facts' from source: unknown 24971 1727096443.06475: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096443.0319023-26309-47145958871787/AnsiballZ_command.py 24971 1727096443.06653: Sending initial data 24971 1727096443.06656: Sent initial data (155 bytes) 24971 1727096443.07993: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096443.08021: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096443.08035: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096443.08058: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096443.08159: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096443.09733: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24971 1727096443.09761: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 24971 1727096443.09800: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24971ctsu8jan/tmpwmqyi3_4 /root/.ansible/tmp/ansible-tmp-1727096443.0319023-26309-47145958871787/AnsiballZ_command.py <<< 24971 1727096443.09803: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096443.0319023-26309-47145958871787/AnsiballZ_command.py" <<< 24971 1727096443.09828: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24971ctsu8jan/tmpwmqyi3_4" to remote "/root/.ansible/tmp/ansible-tmp-1727096443.0319023-26309-47145958871787/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096443.0319023-26309-47145958871787/AnsiballZ_command.py" <<< 24971 1727096443.10373: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096443.10518: stderr chunk (state=3): >>><<< 24971 1727096443.10521: stdout chunk (state=3): >>><<< 24971 1727096443.10523: done transferring module to remote 24971 1727096443.10526: _low_level_execute_command(): starting 24971 1727096443.10528: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096443.0319023-26309-47145958871787/ /root/.ansible/tmp/ansible-tmp-1727096443.0319023-26309-47145958871787/AnsiballZ_command.py && sleep 0' 24971 1727096443.11022: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24971 1727096443.11035: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096443.11050: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096443.11069: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24971 1727096443.11124: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 24971 1727096443.11158: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096443.11234: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096443.11238: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096443.11277: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096443.13055: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096443.13077: stderr chunk (state=3): >>><<< 24971 1727096443.13081: stdout chunk (state=3): >>><<< 24971 1727096443.13095: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096443.13098: _low_level_execute_command(): starting 24971 1727096443.13103: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096443.0319023-26309-47145958871787/AnsiballZ_command.py && sleep 0' 24971 1727096443.13547: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096443.13550: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 24971 1727096443.13553: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096443.13555: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096443.13557: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096443.13611: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096443.13614: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096443.13656: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096443.30230: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "veth0", "type", "veth"], "start": "2024-09-23 09:00:43.285994", "end": "2024-09-23 09:00:43.297790", "delta": "0:00:00.011796", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del veth0 type veth", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 24971 1727096443.32726: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
<<< 24971 1727096443.32729: stderr chunk (state=3): >>><<< 24971 1727096443.32731: stdout chunk (state=3): >>><<< 24971 1727096443.32733: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "veth0", "type", "veth"], "start": "2024-09-23 09:00:43.285994", "end": "2024-09-23 09:00:43.297790", "delta": "0:00:00.011796", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del veth0 type veth", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
24971 1727096443.32826: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del veth0 type veth', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096443.0319023-26309-47145958871787/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24971 1727096443.32835: _low_level_execute_command(): starting 24971 1727096443.32840: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096443.0319023-26309-47145958871787/ > /dev/null 2>&1 && sleep 0' 24971 1727096443.34177: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096443.34181: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096443.34286: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096443.34401: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096443.34408: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096443.34589: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096443.34766: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096443.36977: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096443.36982: stdout chunk (state=3): >>><<< 24971 1727096443.36984: stderr chunk (state=3): >>><<< 24971 1727096443.36987: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096443.36990: handler run complete 24971 1727096443.36993: Evaluated conditional (False): False 24971 1727096443.36996: attempt loop complete, returning result 24971 1727096443.36998: _execute() done 24971 1727096443.37001: dumping result to json 24971 1727096443.37004: done dumping result, returning 24971 1727096443.37005: done running TaskExecutor() for managed_node3/TASK: Delete veth interface veth0 [0afff68d-5257-3482-6844-0000000005d2] 24971 1727096443.37008: sending task result for task 0afff68d-5257-3482-6844-0000000005d2 24971 1727096443.37087: done sending task result for task 0afff68d-5257-3482-6844-0000000005d2 24971 1727096443.37091: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "ip", "link", "del", "veth0", "type", "veth" ], "delta": "0:00:00.011796", "end": "2024-09-23 09:00:43.297790", "rc": 0, "start": "2024-09-23 09:00:43.285994" } 24971 1727096443.37161: no more pending results, returning what we have 24971 1727096443.37165: results queue empty 24971 1727096443.37166: checking for any_errors_fatal 24971 1727096443.37178: done checking for any_errors_fatal 24971 1727096443.37179: checking for max_fail_percentage 24971 1727096443.37181: done checking for max_fail_percentage 24971 1727096443.37182: checking to see if all hosts have failed and the running result is not ok 24971 1727096443.37183: done checking to see if all hosts have failed 24971 1727096443.37184: getting the remaining hosts for this loop 24971 1727096443.37186: done getting the remaining hosts for this loop 24971 1727096443.37189: getting the next task for host managed_node3 24971 1727096443.37196: done getting next task for host managed_node3 24971 1727096443.37199: ^ task is: TASK: Create dummy interface {{ interface }} 24971 1727096443.37203: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 24971 1727096443.37208: getting variables 24971 1727096443.37209: in VariableManager get_vars() 24971 1727096443.37254: Calling all_inventory to load vars for managed_node3 24971 1727096443.37258: Calling groups_inventory to load vars for managed_node3 24971 1727096443.37260: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096443.37387: Calling all_plugins_play to load vars for managed_node3 24971 1727096443.37392: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096443.37395: Calling groups_plugins_play to load vars for managed_node3 24971 1727096443.40662: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096443.42649: done with get_vars() 24971 1727096443.42683: done getting variables 24971 1727096443.42750: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 24971 1727096443.42866: variable 'interface' from source: play vars TASK [Create dummy interface veth0] ******************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:49 Monday 23 September 2024 09:00:43 -0400 (0:00:00.448) 0:00:30.906 ****** 24971 1727096443.42900: entering _queue_task() for managed_node3/command 24971 1727096443.43410: worker is 1 (out of 1 available) 24971 1727096443.43421: exiting _queue_task() for managed_node3/command 24971 1727096443.43432: done queuing things up, now waiting for results queue to drain 24971 1727096443.43433: waiting for pending results... 
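Because type == 'veth', state == 'absent', and veth0 appears in current_interfaces, the "Delete veth interface veth0" task actually executes: it runs ip link del veth0 type veth over the multiplexed SSH connection and removes the test interface in roughly 12 ms (delta 0:00:00.011796). The module itself returned changed: true, while the final task result reports changed: false, which suggests the task suppresses change reporting (for example with changed_when: false); that detail is an inference, as is the exact YAML below:

    - name: Delete veth interface {{ interface }}
      command: ip link del {{ interface }} type veth
      changed_when: false   # inferred from changed:true in the module output vs. changed:false in the task result
      when: type == 'veth' and state == 'absent' and interface in current_interfaces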
24971 1727096443.44111: running TaskExecutor() for managed_node3/TASK: Create dummy interface veth0 24971 1727096443.44324: in run() - task 0afff68d-5257-3482-6844-0000000005d3 24971 1727096443.44336: variable 'ansible_search_path' from source: unknown 24971 1727096443.44340: variable 'ansible_search_path' from source: unknown 24971 1727096443.44411: calling self._execute() 24971 1727096443.44511: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096443.44693: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096443.44710: variable 'omit' from source: magic vars 24971 1727096443.45311: variable 'ansible_distribution_major_version' from source: facts 24971 1727096443.45314: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096443.45521: variable 'type' from source: play vars 24971 1727096443.45525: variable 'state' from source: include params 24971 1727096443.45528: variable 'interface' from source: play vars 24971 1727096443.45533: variable 'current_interfaces' from source: set_fact 24971 1727096443.45542: Evaluated conditional (type == 'dummy' and state == 'present' and interface not in current_interfaces): False 24971 1727096443.45544: when evaluation is False, skipping this task 24971 1727096443.45547: _execute() done 24971 1727096443.45550: dumping result to json 24971 1727096443.45560: done dumping result, returning 24971 1727096443.45571: done running TaskExecutor() for managed_node3/TASK: Create dummy interface veth0 [0afff68d-5257-3482-6844-0000000005d3] 24971 1727096443.45574: sending task result for task 0afff68d-5257-3482-6844-0000000005d3 24971 1727096443.45774: done sending task result for task 0afff68d-5257-3482-6844-0000000005d3 24971 1727096443.45778: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'dummy' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 24971 1727096443.45825: no more pending results, returning what we have 24971 1727096443.45830: results queue empty 24971 1727096443.45831: checking for any_errors_fatal 24971 1727096443.45839: done checking for any_errors_fatal 24971 1727096443.45840: checking for max_fail_percentage 24971 1727096443.45842: done checking for max_fail_percentage 24971 1727096443.45843: checking to see if all hosts have failed and the running result is not ok 24971 1727096443.45844: done checking to see if all hosts have failed 24971 1727096443.45846: getting the remaining hosts for this loop 24971 1727096443.45847: done getting the remaining hosts for this loop 24971 1727096443.45851: getting the next task for host managed_node3 24971 1727096443.45858: done getting next task for host managed_node3 24971 1727096443.45861: ^ task is: TASK: Delete dummy interface {{ interface }} 24971 1727096443.45865: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 24971 1727096443.45871: getting variables 24971 1727096443.45873: in VariableManager get_vars() 24971 1727096443.45919: Calling all_inventory to load vars for managed_node3 24971 1727096443.45922: Calling groups_inventory to load vars for managed_node3 24971 1727096443.45924: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096443.45937: Calling all_plugins_play to load vars for managed_node3 24971 1727096443.45941: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096443.45944: Calling groups_plugins_play to load vars for managed_node3 24971 1727096443.47539: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096443.54688: done with get_vars() 24971 1727096443.54714: done getting variables 24971 1727096443.54760: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 24971 1727096443.54856: variable 'interface' from source: play vars TASK [Delete dummy interface veth0] ******************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:54 Monday 23 September 2024 09:00:43 -0400 (0:00:00.119) 0:00:31.026 ****** 24971 1727096443.54886: entering _queue_task() for managed_node3/command 24971 1727096443.55241: worker is 1 (out of 1 available) 24971 1727096443.55254: exiting _queue_task() for managed_node3/command 24971 1727096443.55272: done queuing things up, now waiting for results queue to drain 24971 1727096443.55274: waiting for pending results... 
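
The skip result above is driven entirely by the task's when expression, which the log echoes back as false_condition. Below is a sketch of how such a guarded task could be written; only the condition and the task name are taken from the log, while the creation command itself is illustrative, since the task never actually ran.

# Hypothetical sketch of the skipped "Create dummy interface veth0" task
# (manage_test_interface.yml:49); the ip command is an assumption.
- name: Create dummy interface {{ interface }}
  ansible.builtin.command: ip link add {{ interface }} type dummy  # assumed command, task was skipped
  when: type == 'dummy' and state == 'present' and interface not in current_interfaces  # condition from the log

The delete-dummy, create-tap, and delete-tap variants that follow in the log are skipped the same way, each reporting its own false_condition.
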
24971 1727096443.55576: running TaskExecutor() for managed_node3/TASK: Delete dummy interface veth0 24971 1727096443.55701: in run() - task 0afff68d-5257-3482-6844-0000000005d4 24971 1727096443.55723: variable 'ansible_search_path' from source: unknown 24971 1727096443.55730: variable 'ansible_search_path' from source: unknown 24971 1727096443.55779: calling self._execute() 24971 1727096443.55888: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096443.55898: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096443.55913: variable 'omit' from source: magic vars 24971 1727096443.56314: variable 'ansible_distribution_major_version' from source: facts 24971 1727096443.56332: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096443.56503: variable 'type' from source: play vars 24971 1727096443.56506: variable 'state' from source: include params 24971 1727096443.56510: variable 'interface' from source: play vars 24971 1727096443.56515: variable 'current_interfaces' from source: set_fact 24971 1727096443.56525: Evaluated conditional (type == 'dummy' and state == 'absent' and interface in current_interfaces): False 24971 1727096443.56528: when evaluation is False, skipping this task 24971 1727096443.56530: _execute() done 24971 1727096443.56533: dumping result to json 24971 1727096443.56535: done dumping result, returning 24971 1727096443.56538: done running TaskExecutor() for managed_node3/TASK: Delete dummy interface veth0 [0afff68d-5257-3482-6844-0000000005d4] 24971 1727096443.56544: sending task result for task 0afff68d-5257-3482-6844-0000000005d4 24971 1727096443.56633: done sending task result for task 0afff68d-5257-3482-6844-0000000005d4 24971 1727096443.56635: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'dummy' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 24971 1727096443.56694: no more pending results, returning what we have 24971 1727096443.56698: results queue empty 24971 1727096443.56698: checking for any_errors_fatal 24971 1727096443.56704: done checking for any_errors_fatal 24971 1727096443.56705: checking for max_fail_percentage 24971 1727096443.56706: done checking for max_fail_percentage 24971 1727096443.56707: checking to see if all hosts have failed and the running result is not ok 24971 1727096443.56708: done checking to see if all hosts have failed 24971 1727096443.56708: getting the remaining hosts for this loop 24971 1727096443.56710: done getting the remaining hosts for this loop 24971 1727096443.56713: getting the next task for host managed_node3 24971 1727096443.56719: done getting next task for host managed_node3 24971 1727096443.56722: ^ task is: TASK: Create tap interface {{ interface }} 24971 1727096443.56725: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 24971 1727096443.56730: getting variables 24971 1727096443.56731: in VariableManager get_vars() 24971 1727096443.56773: Calling all_inventory to load vars for managed_node3 24971 1727096443.56782: Calling groups_inventory to load vars for managed_node3 24971 1727096443.56785: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096443.56795: Calling all_plugins_play to load vars for managed_node3 24971 1727096443.56797: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096443.56800: Calling groups_plugins_play to load vars for managed_node3 24971 1727096443.57737: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096443.59313: done with get_vars() 24971 1727096443.59337: done getting variables 24971 1727096443.59402: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 24971 1727096443.59518: variable 'interface' from source: play vars TASK [Create tap interface veth0] ********************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:60 Monday 23 September 2024 09:00:43 -0400 (0:00:00.046) 0:00:31.073 ****** 24971 1727096443.59549: entering _queue_task() for managed_node3/command 24971 1727096443.59892: worker is 1 (out of 1 available) 24971 1727096443.59904: exiting _queue_task() for managed_node3/command 24971 1727096443.59917: done queuing things up, now waiting for results queue to drain 24971 1727096443.59918: waiting for pending results... 
24971 1727096443.60286: running TaskExecutor() for managed_node3/TASK: Create tap interface veth0 24971 1727096443.60313: in run() - task 0afff68d-5257-3482-6844-0000000005d5 24971 1727096443.60327: variable 'ansible_search_path' from source: unknown 24971 1727096443.60331: variable 'ansible_search_path' from source: unknown 24971 1727096443.60365: calling self._execute() 24971 1727096443.60470: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096443.60478: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096443.60489: variable 'omit' from source: magic vars 24971 1727096443.60871: variable 'ansible_distribution_major_version' from source: facts 24971 1727096443.60961: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096443.61099: variable 'type' from source: play vars 24971 1727096443.61103: variable 'state' from source: include params 24971 1727096443.61108: variable 'interface' from source: play vars 24971 1727096443.61112: variable 'current_interfaces' from source: set_fact 24971 1727096443.61121: Evaluated conditional (type == 'tap' and state == 'present' and interface not in current_interfaces): False 24971 1727096443.61124: when evaluation is False, skipping this task 24971 1727096443.61126: _execute() done 24971 1727096443.61128: dumping result to json 24971 1727096443.61131: done dumping result, returning 24971 1727096443.61137: done running TaskExecutor() for managed_node3/TASK: Create tap interface veth0 [0afff68d-5257-3482-6844-0000000005d5] 24971 1727096443.61171: sending task result for task 0afff68d-5257-3482-6844-0000000005d5 24971 1727096443.61242: done sending task result for task 0afff68d-5257-3482-6844-0000000005d5 24971 1727096443.61244: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'tap' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 24971 1727096443.61317: no more pending results, returning what we have 24971 1727096443.61321: results queue empty 24971 1727096443.61322: checking for any_errors_fatal 24971 1727096443.61327: done checking for any_errors_fatal 24971 1727096443.61328: checking for max_fail_percentage 24971 1727096443.61330: done checking for max_fail_percentage 24971 1727096443.61330: checking to see if all hosts have failed and the running result is not ok 24971 1727096443.61331: done checking to see if all hosts have failed 24971 1727096443.61332: getting the remaining hosts for this loop 24971 1727096443.61333: done getting the remaining hosts for this loop 24971 1727096443.61337: getting the next task for host managed_node3 24971 1727096443.61343: done getting next task for host managed_node3 24971 1727096443.61346: ^ task is: TASK: Delete tap interface {{ interface }} 24971 1727096443.61350: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 24971 1727096443.61356: getting variables 24971 1727096443.61357: in VariableManager get_vars() 24971 1727096443.61404: Calling all_inventory to load vars for managed_node3 24971 1727096443.61407: Calling groups_inventory to load vars for managed_node3 24971 1727096443.61410: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096443.61425: Calling all_plugins_play to load vars for managed_node3 24971 1727096443.61429: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096443.61433: Calling groups_plugins_play to load vars for managed_node3 24971 1727096443.63097: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096443.64648: done with get_vars() 24971 1727096443.64676: done getting variables 24971 1727096443.64733: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 24971 1727096443.64847: variable 'interface' from source: play vars TASK [Delete tap interface veth0] ********************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:65 Monday 23 September 2024 09:00:43 -0400 (0:00:00.053) 0:00:31.126 ****** 24971 1727096443.64884: entering _queue_task() for managed_node3/command 24971 1727096443.65228: worker is 1 (out of 1 available) 24971 1727096443.65240: exiting _queue_task() for managed_node3/command 24971 1727096443.65253: done queuing things up, now waiting for results queue to drain 24971 1727096443.65254: waiting for pending results... 
24971 1727096443.65631: running TaskExecutor() for managed_node3/TASK: Delete tap interface veth0 24971 1727096443.65638: in run() - task 0afff68d-5257-3482-6844-0000000005d6 24971 1727096443.65653: variable 'ansible_search_path' from source: unknown 24971 1727096443.65656: variable 'ansible_search_path' from source: unknown 24971 1727096443.65703: calling self._execute() 24971 1727096443.65839: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096443.65843: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096443.65847: variable 'omit' from source: magic vars 24971 1727096443.66203: variable 'ansible_distribution_major_version' from source: facts 24971 1727096443.66216: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096443.66472: variable 'type' from source: play vars 24971 1727096443.66476: variable 'state' from source: include params 24971 1727096443.66479: variable 'interface' from source: play vars 24971 1727096443.66482: variable 'current_interfaces' from source: set_fact 24971 1727096443.66485: Evaluated conditional (type == 'tap' and state == 'absent' and interface in current_interfaces): False 24971 1727096443.66487: when evaluation is False, skipping this task 24971 1727096443.66491: _execute() done 24971 1727096443.66492: dumping result to json 24971 1727096443.66494: done dumping result, returning 24971 1727096443.66497: done running TaskExecutor() for managed_node3/TASK: Delete tap interface veth0 [0afff68d-5257-3482-6844-0000000005d6] 24971 1727096443.66499: sending task result for task 0afff68d-5257-3482-6844-0000000005d6 24971 1727096443.66778: done sending task result for task 0afff68d-5257-3482-6844-0000000005d6 24971 1727096443.66782: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'tap' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 24971 1727096443.66866: no more pending results, returning what we have 24971 1727096443.66874: results queue empty 24971 1727096443.66875: checking for any_errors_fatal 24971 1727096443.66879: done checking for any_errors_fatal 24971 1727096443.66880: checking for max_fail_percentage 24971 1727096443.66882: done checking for max_fail_percentage 24971 1727096443.66882: checking to see if all hosts have failed and the running result is not ok 24971 1727096443.66883: done checking to see if all hosts have failed 24971 1727096443.66884: getting the remaining hosts for this loop 24971 1727096443.66885: done getting the remaining hosts for this loop 24971 1727096443.66888: getting the next task for host managed_node3 24971 1727096443.66896: done getting next task for host managed_node3 24971 1727096443.66898: ^ task is: TASK: Clean up namespace 24971 1727096443.66901: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=6, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24971 1727096443.66904: getting variables 24971 1727096443.66905: in VariableManager get_vars() 24971 1727096443.66940: Calling all_inventory to load vars for managed_node3 24971 1727096443.66942: Calling groups_inventory to load vars for managed_node3 24971 1727096443.66944: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096443.66954: Calling all_plugins_play to load vars for managed_node3 24971 1727096443.66957: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096443.66960: Calling groups_plugins_play to load vars for managed_node3 24971 1727096443.68241: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096443.69818: done with get_vars() 24971 1727096443.69843: done getting variables 24971 1727096443.69908: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Clean up namespace] ****************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:108 Monday 23 September 2024 09:00:43 -0400 (0:00:00.050) 0:00:31.177 ****** 24971 1727096443.69938: entering _queue_task() for managed_node3/command 24971 1727096443.70280: worker is 1 (out of 1 available) 24971 1727096443.70293: exiting _queue_task() for managed_node3/command 24971 1727096443.70306: done queuing things up, now waiting for results queue to drain 24971 1727096443.70307: waiting for pending results... 24971 1727096443.70592: running TaskExecutor() for managed_node3/TASK: Clean up namespace 24971 1727096443.70663: in run() - task 0afff68d-5257-3482-6844-0000000000b4 24971 1727096443.70683: variable 'ansible_search_path' from source: unknown 24971 1727096443.70717: calling self._execute() 24971 1727096443.70822: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096443.70826: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096443.70832: variable 'omit' from source: magic vars 24971 1727096443.71215: variable 'ansible_distribution_major_version' from source: facts 24971 1727096443.71233: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096443.71238: variable 'omit' from source: magic vars 24971 1727096443.71258: variable 'omit' from source: magic vars 24971 1727096443.71296: variable 'omit' from source: magic vars 24971 1727096443.71341: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24971 1727096443.71379: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24971 1727096443.71399: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24971 1727096443.71416: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096443.71427: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096443.71463: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24971 1727096443.71473: variable 'ansible_host' from source: host vars for 
'managed_node3' 24971 1727096443.71477: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096443.71585: Set connection var ansible_shell_type to sh 24971 1727096443.71588: Set connection var ansible_shell_executable to /bin/sh 24971 1727096443.71672: Set connection var ansible_timeout to 10 24971 1727096443.71675: Set connection var ansible_connection to ssh 24971 1727096443.71677: Set connection var ansible_pipelining to False 24971 1727096443.71679: Set connection var ansible_module_compression to ZIP_DEFLATED 24971 1727096443.71681: variable 'ansible_shell_executable' from source: unknown 24971 1727096443.71683: variable 'ansible_connection' from source: unknown 24971 1727096443.71687: variable 'ansible_module_compression' from source: unknown 24971 1727096443.71689: variable 'ansible_shell_type' from source: unknown 24971 1727096443.71691: variable 'ansible_shell_executable' from source: unknown 24971 1727096443.71693: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096443.71695: variable 'ansible_pipelining' from source: unknown 24971 1727096443.71697: variable 'ansible_timeout' from source: unknown 24971 1727096443.71699: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096443.71819: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24971 1727096443.71823: variable 'omit' from source: magic vars 24971 1727096443.71826: starting attempt loop 24971 1727096443.71828: running the handler 24971 1727096443.71831: _low_level_execute_command(): starting 24971 1727096443.71833: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24971 1727096443.72737: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24971 1727096443.72741: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096443.72744: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096443.72763: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24971 1727096443.72768: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 24971 1727096443.72772: stderr chunk (state=3): >>>debug2: match not found <<< 24971 1727096443.72775: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096443.72778: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096443.72797: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096443.72877: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096443.74530: stdout chunk (state=3): >>>/root <<< 24971 1727096443.74703: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096443.74707: stdout chunk (state=3): >>><<< 24971 1727096443.74709: stderr chunk (state=3): >>><<< 24971 1727096443.74745: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096443.74765: _low_level_execute_command(): starting 24971 1727096443.74794: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096443.7475185-26351-178143788355435 `" && echo ansible-tmp-1727096443.7475185-26351-178143788355435="` echo /root/.ansible/tmp/ansible-tmp-1727096443.7475185-26351-178143788355435 `" ) && sleep 0' 24971 1727096443.75426: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24971 1727096443.75532: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096443.75536: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096443.75538: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24971 1727096443.75541: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096443.75551: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096443.75619: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096443.75636: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 
1727096443.75697: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096443.77608: stdout chunk (state=3): >>>ansible-tmp-1727096443.7475185-26351-178143788355435=/root/.ansible/tmp/ansible-tmp-1727096443.7475185-26351-178143788355435 <<< 24971 1727096443.77740: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096443.77754: stderr chunk (state=3): >>><<< 24971 1727096443.77866: stdout chunk (state=3): >>><<< 24971 1727096443.77872: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096443.7475185-26351-178143788355435=/root/.ansible/tmp/ansible-tmp-1727096443.7475185-26351-178143788355435 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096443.77875: variable 'ansible_module_compression' from source: unknown 24971 1727096443.77902: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24971ctsu8jan/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 24971 1727096443.77944: variable 'ansible_facts' from source: unknown 24971 1727096443.78051: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096443.7475185-26351-178143788355435/AnsiballZ_command.py 24971 1727096443.78234: Sending initial data 24971 1727096443.78243: Sent initial data (156 bytes) 24971 1727096443.78841: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24971 1727096443.78886: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration <<< 24971 1727096443.78906: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096443.78986: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
<<< 24971 1727096443.79007: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096443.79023: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096443.79046: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096443.79101: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096443.80686: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 24971 1727096443.80689: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24971 1727096443.80712: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 24971 1727096443.80751: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24971ctsu8jan/tmp1w60c1g2 /root/.ansible/tmp/ansible-tmp-1727096443.7475185-26351-178143788355435/AnsiballZ_command.py <<< 24971 1727096443.80754: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096443.7475185-26351-178143788355435/AnsiballZ_command.py" <<< 24971 1727096443.80785: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24971ctsu8jan/tmp1w60c1g2" to remote "/root/.ansible/tmp/ansible-tmp-1727096443.7475185-26351-178143788355435/AnsiballZ_command.py" <<< 24971 1727096443.80788: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096443.7475185-26351-178143788355435/AnsiballZ_command.py" <<< 24971 1727096443.81263: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096443.81445: stderr chunk (state=3): >>><<< 24971 1727096443.81449: stdout chunk (state=3): >>><<< 24971 1727096443.81452: done transferring module to remote 24971 1727096443.81454: _low_level_execute_command(): starting 24971 1727096443.81457: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096443.7475185-26351-178143788355435/ /root/.ansible/tmp/ansible-tmp-1727096443.7475185-26351-178143788355435/AnsiballZ_command.py && sleep 0' 24971 1727096443.82008: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096443.82012: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24971 1727096443.82014: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096443.82016: stderr chunk (state=3): >>>debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096443.82018: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 24971 1727096443.82024: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096443.82064: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096443.82083: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096443.82105: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096443.82162: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096443.83961: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096443.83973: stdout chunk (state=3): >>><<< 24971 1727096443.83983: stderr chunk (state=3): >>><<< 24971 1727096443.84001: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096443.84004: _low_level_execute_command(): starting 24971 1727096443.84007: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096443.7475185-26351-178143788355435/AnsiballZ_command.py && sleep 0' 24971 1727096443.84450: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096443.84454: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096443.84456: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096443.84458: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 24971 1727096443.84460: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096443.84507: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096443.84510: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096443.84559: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096444.00436: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "netns", "delete", "ns1"], "start": "2024-09-23 09:00:43.996077", "end": "2024-09-23 09:00:44.001311", "delta": "0:00:00.005234", "msg": "", "invocation": {"module_args": {"_raw_params": "ip netns delete ns1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 24971 1727096444.02040: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. <<< 24971 1727096444.02077: stderr chunk (state=3): >>><<< 24971 1727096444.02080: stdout chunk (state=3): >>><<< 24971 1727096444.02175: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "netns", "delete", "ns1"], "start": "2024-09-23 09:00:43.996077", "end": "2024-09-23 09:00:44.001311", "delta": "0:00:00.005234", "msg": "", "invocation": {"module_args": {"_raw_params": "ip netns delete ns1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
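
The JSON above is the remote AnsiballZ wrapper for ansible.legacy.command reporting that `ip netns delete ns1` ran successfully. The surrounding stderr chunks show the usual low-level flow for a non-pipelined command task: create a remote temp directory, sftp AnsiballZ_command.py across, chmod it, execute it with /usr/bin/python3.12, and finally remove the temp directory (the rm -f -r call that follows). On the playbook side, the task implied by the logged module_args is roughly the following; the YAML layout is a sketch, but the task name, its location (tests_ipv6.yml:108), and the command string come from the log.

# Sketch of the "Clean up namespace" task as implied by module_args
# (_raw_params: "ip netns delete ns1", _uses_shell: false, i.e. the command module).
- name: Clean up namespace
  ansible.builtin.command: ip netns delete ns1
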
24971 1727096444.02179: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip netns delete ns1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096443.7475185-26351-178143788355435/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24971 1727096444.02183: _low_level_execute_command(): starting 24971 1727096444.02185: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096443.7475185-26351-178143788355435/ > /dev/null 2>&1 && sleep 0' 24971 1727096444.02759: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24971 1727096444.02772: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096444.02784: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096444.02798: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24971 1727096444.02847: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 24971 1727096444.02850: stderr chunk (state=3): >>>debug2: match not found <<< 24971 1727096444.02858: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096444.02861: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24971 1727096444.02863: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address <<< 24971 1727096444.02866: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24971 1727096444.02870: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096444.02872: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096444.02883: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24971 1727096444.02954: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 24971 1727096444.02958: stderr chunk (state=3): >>>debug2: match found <<< 24971 1727096444.02961: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096444.02971: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096444.02987: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096444.03011: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096444.03065: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096444.04949: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096444.04953: stdout chunk (state=3): >>><<< 24971 1727096444.04955: stderr chunk (state=3): >>><<< 24971 1727096444.04974: _low_level_execute_command() done: rc=0, 
stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096444.05073: handler run complete 24971 1727096444.05076: Evaluated conditional (False): False 24971 1727096444.05079: attempt loop complete, returning result 24971 1727096444.05081: _execute() done 24971 1727096444.05083: dumping result to json 24971 1727096444.05084: done dumping result, returning 24971 1727096444.05086: done running TaskExecutor() for managed_node3/TASK: Clean up namespace [0afff68d-5257-3482-6844-0000000000b4] 24971 1727096444.05088: sending task result for task 0afff68d-5257-3482-6844-0000000000b4 ok: [managed_node3] => { "changed": false, "cmd": [ "ip", "netns", "delete", "ns1" ], "delta": "0:00:00.005234", "end": "2024-09-23 09:00:44.001311", "rc": 0, "start": "2024-09-23 09:00:43.996077" } 24971 1727096444.05265: no more pending results, returning what we have 24971 1727096444.05271: results queue empty 24971 1727096444.05272: checking for any_errors_fatal 24971 1727096444.05279: done checking for any_errors_fatal 24971 1727096444.05279: checking for max_fail_percentage 24971 1727096444.05281: done checking for max_fail_percentage 24971 1727096444.05282: checking to see if all hosts have failed and the running result is not ok 24971 1727096444.05283: done checking to see if all hosts have failed 24971 1727096444.05284: getting the remaining hosts for this loop 24971 1727096444.05285: done getting the remaining hosts for this loop 24971 1727096444.05288: getting the next task for host managed_node3 24971 1727096444.05295: done getting next task for host managed_node3 24971 1727096444.05298: ^ task is: TASK: Verify network state restored to default 24971 1727096444.05300: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=7, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24971 1727096444.05304: getting variables 24971 1727096444.05307: in VariableManager get_vars() 24971 1727096444.05349: Calling all_inventory to load vars for managed_node3 24971 1727096444.05352: Calling groups_inventory to load vars for managed_node3 24971 1727096444.05354: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096444.05366: Calling all_plugins_play to load vars for managed_node3 24971 1727096444.05579: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096444.05585: done sending task result for task 0afff68d-5257-3482-6844-0000000000b4 24971 1727096444.05588: WORKER PROCESS EXITING 24971 1727096444.05592: Calling groups_plugins_play to load vars for managed_node3 24971 1727096444.07225: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096444.08825: done with get_vars() 24971 1727096444.08852: done getting variables TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:113 Monday 23 September 2024 09:00:44 -0400 (0:00:00.390) 0:00:31.567 ****** 24971 1727096444.08951: entering _queue_task() for managed_node3/include_tasks 24971 1727096444.09493: worker is 1 (out of 1 available) 24971 1727096444.09503: exiting _queue_task() for managed_node3/include_tasks 24971 1727096444.09513: done queuing things up, now waiting for results queue to drain 24971 1727096444.09514: waiting for pending results... 24971 1727096444.09612: running TaskExecutor() for managed_node3/TASK: Verify network state restored to default 24971 1727096444.09727: in run() - task 0afff68d-5257-3482-6844-0000000000b5 24971 1727096444.09754: variable 'ansible_search_path' from source: unknown 24971 1727096444.09800: calling self._execute() 24971 1727096444.09909: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096444.09924: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096444.09938: variable 'omit' from source: magic vars 24971 1727096444.10369: variable 'ansible_distribution_major_version' from source: facts 24971 1727096444.10392: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096444.10408: _execute() done 24971 1727096444.10416: dumping result to json 24971 1727096444.10423: done dumping result, returning 24971 1727096444.10431: done running TaskExecutor() for managed_node3/TASK: Verify network state restored to default [0afff68d-5257-3482-6844-0000000000b5] 24971 1727096444.10442: sending task result for task 0afff68d-5257-3482-6844-0000000000b5 24971 1727096444.10710: no more pending results, returning what we have 24971 1727096444.10717: in VariableManager get_vars() 24971 1727096444.10775: Calling all_inventory to load vars for managed_node3 24971 1727096444.10778: Calling groups_inventory to load vars for managed_node3 24971 1727096444.10780: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096444.10786: done sending task result for task 0afff68d-5257-3482-6844-0000000000b5 24971 1727096444.10789: WORKER PROCESS EXITING 24971 1727096444.10802: Calling all_plugins_play to load vars for managed_node3 24971 1727096444.10806: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096444.10809: Calling groups_plugins_play to load vars for managed_node3 24971 1727096444.12254: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096444.13937: done with get_vars() 24971 1727096444.13964: variable 'ansible_search_path' from source: unknown 24971 1727096444.13985: we have included files to process 24971 1727096444.13986: generating all_blocks data 24971 1727096444.13988: done generating all_blocks data 24971 1727096444.13993: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 24971 1727096444.13994: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 24971 1727096444.13998: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 24971 1727096444.14413: done processing included file 24971 1727096444.14416: iterating over new_blocks loaded from include file 24971 1727096444.14417: in VariableManager get_vars() 24971 1727096444.14435: done with get_vars() 24971 1727096444.14437: filtering new block on tags 24971 1727096444.14455: done filtering new block on tags 24971 1727096444.14458: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed_node3 24971 1727096444.14463: extending task lists for all hosts with included blocks 24971 1727096444.17256: done extending task lists 24971 1727096444.17259: done processing included files 24971 1727096444.17259: results queue empty 24971 1727096444.17260: checking for any_errors_fatal 24971 1727096444.17265: done checking for any_errors_fatal 24971 1727096444.17266: checking for max_fail_percentage 24971 1727096444.17269: done checking for max_fail_percentage 24971 1727096444.17270: checking to see if all hosts have failed and the running result is not ok 24971 1727096444.17271: done checking to see if all hosts have failed 24971 1727096444.17271: getting the remaining hosts for this loop 24971 1727096444.17273: done getting the remaining hosts for this loop 24971 1727096444.17275: getting the next task for host managed_node3 24971 1727096444.17279: done getting next task for host managed_node3 24971 1727096444.17281: ^ task is: TASK: Check routes and DNS 24971 1727096444.17284: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=8, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 24971 1727096444.17287: getting variables 24971 1727096444.17288: in VariableManager get_vars() 24971 1727096444.17311: Calling all_inventory to load vars for managed_node3 24971 1727096444.17313: Calling groups_inventory to load vars for managed_node3 24971 1727096444.17315: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096444.17323: Calling all_plugins_play to load vars for managed_node3 24971 1727096444.17325: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096444.17327: Calling groups_plugins_play to load vars for managed_node3 24971 1727096444.18502: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096444.20053: done with get_vars() 24971 1727096444.20082: done getting variables 24971 1727096444.20126: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Monday 23 September 2024 09:00:44 -0400 (0:00:00.112) 0:00:31.679 ****** 24971 1727096444.20158: entering _queue_task() for managed_node3/shell 24971 1727096444.20516: worker is 1 (out of 1 available) 24971 1727096444.20527: exiting _queue_task() for managed_node3/shell 24971 1727096444.20539: done queuing things up, now waiting for results queue to drain 24971 1727096444.20540: waiting for pending results... 24971 1727096444.20899: running TaskExecutor() for managed_node3/TASK: Check routes and DNS 24971 1727096444.21015: in run() - task 0afff68d-5257-3482-6844-00000000075e 24971 1727096444.21103: variable 'ansible_search_path' from source: unknown 24971 1727096444.21107: variable 'ansible_search_path' from source: unknown 24971 1727096444.21111: calling self._execute() 24971 1727096444.21195: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096444.21214: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096444.21231: variable 'omit' from source: magic vars 24971 1727096444.21645: variable 'ansible_distribution_major_version' from source: facts 24971 1727096444.21672: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096444.21685: variable 'omit' from source: magic vars 24971 1727096444.21732: variable 'omit' from source: magic vars 24971 1727096444.21869: variable 'omit' from source: magic vars 24971 1727096444.21872: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24971 1727096444.21883: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24971 1727096444.21912: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24971 1727096444.21939: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096444.21959: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096444.22006: variable 'inventory_hostname' from source: host vars for 'managed_node3' 
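The records above and below resolve the ssh connection and sh shell plugins for this task. For reference, the command this task ultimately executes on managed_node3 is recorded verbatim in the module invocation further down in this log; written out as plain shell it is:

    # TASK [Check routes and DNS] -- command taken from the recorded _raw_params;
    # it dumps addresses, routes and resolver config so the test can inspect the restored state
    set -euo pipefail
    echo IP
    ip a
    echo IP ROUTE
    ip route
    echo IP -6 ROUTE
    ip -6 route
    echo RESOLV
    if [ -f /etc/resolv.conf ]; then
        cat /etc/resolv.conf
    else
        echo NO /etc/resolv.conf
        ls -alrtF /etc/resolv.* || :
    fi
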
24971 1727096444.22016: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096444.22025: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096444.22139: Set connection var ansible_shell_type to sh 24971 1727096444.22192: Set connection var ansible_shell_executable to /bin/sh 24971 1727096444.22195: Set connection var ansible_timeout to 10 24971 1727096444.22198: Set connection var ansible_connection to ssh 24971 1727096444.22204: Set connection var ansible_pipelining to False 24971 1727096444.22207: Set connection var ansible_module_compression to ZIP_DEFLATED 24971 1727096444.22223: variable 'ansible_shell_executable' from source: unknown 24971 1727096444.22231: variable 'ansible_connection' from source: unknown 24971 1727096444.22238: variable 'ansible_module_compression' from source: unknown 24971 1727096444.22246: variable 'ansible_shell_type' from source: unknown 24971 1727096444.22289: variable 'ansible_shell_executable' from source: unknown 24971 1727096444.22292: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096444.22300: variable 'ansible_pipelining' from source: unknown 24971 1727096444.22304: variable 'ansible_timeout' from source: unknown 24971 1727096444.22306: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096444.22808: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24971 1727096444.22812: variable 'omit' from source: magic vars 24971 1727096444.22814: starting attempt loop 24971 1727096444.22816: running the handler 24971 1727096444.22819: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24971 1727096444.22821: _low_level_execute_command(): starting 24971 1727096444.22823: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24971 1727096444.23947: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24971 1727096444.23959: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096444.23975: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096444.23992: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24971 1727096444.24096: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing 
master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096444.24376: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096444.24642: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096444.26163: stdout chunk (state=3): >>>/root <<< 24971 1727096444.26294: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096444.26298: stdout chunk (state=3): >>><<< 24971 1727096444.26309: stderr chunk (state=3): >>><<< 24971 1727096444.26336: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096444.26349: _low_level_execute_command(): starting 24971 1727096444.26383: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096444.2633498-26365-35136815385553 `" && echo ansible-tmp-1727096444.2633498-26365-35136815385553="` echo /root/.ansible/tmp/ansible-tmp-1727096444.2633498-26365-35136815385553 `" ) && sleep 0' 24971 1727096444.27010: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24971 1727096444.27176: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 24971 1727096444.27188: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address <<< 24971 1727096444.27192: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096444.27196: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096444.27198: stderr 
chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096444.27238: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096444.29211: stdout chunk (state=3): >>>ansible-tmp-1727096444.2633498-26365-35136815385553=/root/.ansible/tmp/ansible-tmp-1727096444.2633498-26365-35136815385553 <<< 24971 1727096444.29400: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096444.29523: stdout chunk (state=3): >>><<< 24971 1727096444.29526: stderr chunk (state=3): >>><<< 24971 1727096444.29529: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096444.2633498-26365-35136815385553=/root/.ansible/tmp/ansible-tmp-1727096444.2633498-26365-35136815385553 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096444.29669: variable 'ansible_module_compression' from source: unknown 24971 1727096444.29816: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24971ctsu8jan/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 24971 1727096444.29887: variable 'ansible_facts' from source: unknown 24971 1727096444.30049: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096444.2633498-26365-35136815385553/AnsiballZ_command.py 24971 1727096444.30404: Sending initial data 24971 1727096444.30407: Sent initial data (155 bytes) 24971 1727096444.31230: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096444.31242: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096444.31308: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096444.33075: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24971 1727096444.33079: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 24971 1727096444.33082: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096444.2633498-26365-35136815385553/AnsiballZ_command.py" <<< 24971 1727096444.33084: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24971ctsu8jan/tmpkap44a0q /root/.ansible/tmp/ansible-tmp-1727096444.2633498-26365-35136815385553/AnsiballZ_command.py <<< 24971 1727096444.33087: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24971ctsu8jan/tmpkap44a0q" to remote "/root/.ansible/tmp/ansible-tmp-1727096444.2633498-26365-35136815385553/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096444.2633498-26365-35136815385553/AnsiballZ_command.py" <<< 24971 1727096444.34539: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096444.34543: stdout chunk (state=3): >>><<< 24971 1727096444.34548: stderr chunk (state=3): >>><<< 24971 1727096444.34611: done transferring module to remote 24971 1727096444.34620: _low_level_execute_command(): starting 24971 1727096444.34625: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096444.2633498-26365-35136815385553/ /root/.ansible/tmp/ansible-tmp-1727096444.2633498-26365-35136815385553/AnsiballZ_command.py && sleep 0' 24971 1727096444.35746: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24971 1727096444.35759: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096444.35774: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096444.35787: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24971 1727096444.35801: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 24971 1727096444.36066: stderr chunk (state=3): >>>debug2: match not found <<< 24971 1727096444.36074: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096444.36078: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096444.36085: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096444.36087: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096444.36297: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096444.36418: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096444.38249: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096444.38254: stdout chunk (state=3): >>><<< 24971 1727096444.38311: stderr chunk (state=3): >>><<< 24971 1727096444.38315: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096444.38318: _low_level_execute_command(): starting 24971 1727096444.38321: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096444.2633498-26365-35136815385553/AnsiballZ_command.py && sleep 0' 24971 1727096444.39508: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24971 1727096444.39511: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096444.39616: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096444.39620: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24971 1727096444.39622: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 24971 1727096444.39624: stderr chunk (state=3): >>>debug2: match not found <<< 24971 1727096444.39626: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096444.39628: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24971 1727096444.39630: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 
10.31.14.152 is address <<< 24971 1727096444.39632: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24971 1727096444.39633: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096444.39684: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096444.39736: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096444.40095: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096444.40099: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096444.40171: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096444.56378: stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 0a:ff:df:7b:8e:75 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.14.152/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0\n valid_lft 3035sec preferred_lft 3035sec\n inet6 fe80::8ff:dfff:fe7b:8e75/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.12.1 dev eth0 proto dhcp src 10.31.14.152 metric 100 \n10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.14.152 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-23 09:00:44.551818", "end": "2024-09-23 09:00:44.560530", "delta": "0:00:00.008712", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 24971 1727096444.58223: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
<<< 24971 1727096444.58227: stdout chunk (state=3): >>><<< 24971 1727096444.58230: stderr chunk (state=3): >>><<< 24971 1727096444.58256: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 0a:ff:df:7b:8e:75 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.14.152/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0\n valid_lft 3035sec preferred_lft 3035sec\n inet6 fe80::8ff:dfff:fe7b:8e75/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.12.1 dev eth0 proto dhcp src 10.31.14.152 metric 100 \n10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.14.152 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-23 09:00:44.551818", "end": "2024-09-23 09:00:44.560530", "delta": "0:00:00.008712", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
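Taken together, the _low_level_execute_command() records for this task trace the full round trip ansible-core makes for one command module over SSH. Condensed into plain shell for readability (every command is copied from the records above and below; the per-task temporary directory is factored into a variable here, and the real mkdir command additionally echoes the name=path pair back so the controller learns the expanded location):

    # Commands the controller sends to the managed host for a single task, in order
    tmp=/root/.ansible/tmp/ansible-tmp-1727096444.2633498-26365-35136815385553
    /bin/sh -c 'echo ~ && sleep 0'                                        # discover the remote home directory
    /bin/sh -c "( umask 77 && mkdir -p /root/.ansible/tmp && mkdir $tmp ) && sleep 0"   # private temp dir for this task
    # sftp put: AnsiballZ_command.py (the packaged command module and its payload) is uploaded into $tmp
    /bin/sh -c "chmod u+x $tmp/ $tmp/AnsiballZ_command.py && sleep 0"     # make the wrapper executable
    /bin/sh -c "/usr/bin/python3.12 $tmp/AnsiballZ_command.py && sleep 0" # run it; its stdout is the JSON result above
    /bin/sh -c "rm -f -r $tmp/ > /dev/null 2>&1 && sleep 0"               # cleanup, in the record immediately below

Each of these hops reuses the existing ControlMaster socket at /root/.ansible/cp/e9699315b0, which is why the stderr shows only mux_client_request_session exchanges rather than fresh SSH handshakes.
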
24971 1727096444.58409: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096444.2633498-26365-35136815385553/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24971 1727096444.58413: _low_level_execute_command(): starting 24971 1727096444.58416: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096444.2633498-26365-35136815385553/ > /dev/null 2>&1 && sleep 0' 24971 1727096444.59091: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096444.59112: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24971 1727096444.59129: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 24971 1727096444.59288: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096444.59395: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096444.59446: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096444.59476: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096444.61332: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096444.61346: stdout chunk (state=3): >>><<< 24971 1727096444.61373: stderr chunk (state=3): >>><<< 24971 1727096444.61398: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096444.61410: handler run complete 24971 1727096444.61438: Evaluated conditional (False): False 24971 1727096444.61456: attempt loop complete, returning result 24971 1727096444.61488: _execute() done 24971 1727096444.61491: dumping result to json 24971 1727096444.61493: done dumping result, returning 24971 1727096444.61507: done running TaskExecutor() for managed_node3/TASK: Check routes and DNS [0afff68d-5257-3482-6844-00000000075e] 24971 1727096444.61575: sending task result for task 0afff68d-5257-3482-6844-00000000075e 24971 1727096444.61651: done sending task result for task 0afff68d-5257-3482-6844-00000000075e 24971 1727096444.61654: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.008712", "end": "2024-09-23 09:00:44.560530", "rc": 0, "start": "2024-09-23 09:00:44.551818" } STDOUT: IP 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host noprefixroute valid_lft forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000 link/ether 0a:ff:df:7b:8e:75 brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.14.152/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0 valid_lft 3035sec preferred_lft 3035sec inet6 fe80::8ff:dfff:fe7b:8e75/64 scope link noprefixroute valid_lft forever preferred_lft forever IP ROUTE default via 10.31.12.1 dev eth0 proto dhcp src 10.31.14.152 metric 100 10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.14.152 metric 100 IP -6 ROUTE fe80::/64 dev eth0 proto kernel metric 1024 pref medium RESOLV # Generated by NetworkManager search us-east-1.aws.redhat.com nameserver 10.29.169.13 nameserver 10.29.170.12 nameserver 10.2.32.1 24971 1727096444.61737: no more pending results, returning what we have 24971 1727096444.61742: results queue empty 24971 1727096444.61743: checking for any_errors_fatal 24971 1727096444.61745: done checking for any_errors_fatal 24971 1727096444.61746: checking for max_fail_percentage 24971 1727096444.61747: done checking for max_fail_percentage 24971 1727096444.61748: checking to see if all hosts have failed and the running result is not ok 24971 1727096444.61749: done checking to see if all hosts have failed 24971 1727096444.61750: getting the remaining hosts for this loop 24971 1727096444.61751: done getting the remaining hosts for this loop 24971 1727096444.61755: getting the next task for host managed_node3 24971 1727096444.61761: done getting next task for host 
managed_node3 24971 1727096444.61764: ^ task is: TASK: Verify DNS and network connectivity 24971 1727096444.61767: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=8, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 24971 1727096444.61776: getting variables 24971 1727096444.61778: in VariableManager get_vars() 24971 1727096444.61820: Calling all_inventory to load vars for managed_node3 24971 1727096444.61827: Calling groups_inventory to load vars for managed_node3 24971 1727096444.61830: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096444.61843: Calling all_plugins_play to load vars for managed_node3 24971 1727096444.61848: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096444.61851: Calling groups_plugins_play to load vars for managed_node3 24971 1727096444.63701: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096444.65405: done with get_vars() 24971 1727096444.65430: done getting variables 24971 1727096444.65502: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Monday 23 September 2024 09:00:44 -0400 (0:00:00.453) 0:00:32.133 ****** 24971 1727096444.65533: entering _queue_task() for managed_node3/shell 24971 1727096444.66187: worker is 1 (out of 1 available) 24971 1727096444.66198: exiting _queue_task() for managed_node3/shell 24971 1727096444.66209: done queuing things up, now waiting for results queue to drain 24971 1727096444.66210: waiting for pending results... 
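As with the previous task, the command this task runs is recorded verbatim in the module invocation further below; written out as plain shell it is:

    # TASK [Verify DNS and network connectivity] -- command taken from the recorded _raw_params;
    # it fails fast on the first host that cannot be resolved or reached over HTTPS
    set -euo pipefail
    echo CHECK DNS AND CONNECTIVITY
    for host in mirrors.fedoraproject.org mirrors.centos.org; do
        if ! getent hosts "$host"; then
            echo FAILED to lookup host "$host"
            exit 1
        fi
        if ! curl -o /dev/null https://"$host"; then
            echo FAILED to contact host "$host"
            exit 1
        fi
    done
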
24971 1727096444.66686: running TaskExecutor() for managed_node3/TASK: Verify DNS and network connectivity 24971 1727096444.66731: in run() - task 0afff68d-5257-3482-6844-00000000075f 24971 1727096444.66751: variable 'ansible_search_path' from source: unknown 24971 1727096444.67037: variable 'ansible_search_path' from source: unknown 24971 1727096444.67041: calling self._execute() 24971 1727096444.67085: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096444.67096: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096444.67111: variable 'omit' from source: magic vars 24971 1727096444.67840: variable 'ansible_distribution_major_version' from source: facts 24971 1727096444.67915: Evaluated conditional (ansible_distribution_major_version != '6'): True 24971 1727096444.68251: variable 'ansible_facts' from source: unknown 24971 1727096444.69381: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): True 24971 1727096444.69392: variable 'omit' from source: magic vars 24971 1727096444.69440: variable 'omit' from source: magic vars 24971 1727096444.69483: variable 'omit' from source: magic vars 24971 1727096444.69535: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24971 1727096444.69581: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24971 1727096444.69603: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24971 1727096444.69627: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096444.69643: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24971 1727096444.69678: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24971 1727096444.69725: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096444.69728: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096444.69792: Set connection var ansible_shell_type to sh 24971 1727096444.69804: Set connection var ansible_shell_executable to /bin/sh 24971 1727096444.69820: Set connection var ansible_timeout to 10 24971 1727096444.69834: Set connection var ansible_connection to ssh 24971 1727096444.69848: Set connection var ansible_pipelining to False 24971 1727096444.69858: Set connection var ansible_module_compression to ZIP_DEFLATED 24971 1727096444.69888: variable 'ansible_shell_executable' from source: unknown 24971 1727096444.69942: variable 'ansible_connection' from source: unknown 24971 1727096444.69945: variable 'ansible_module_compression' from source: unknown 24971 1727096444.69947: variable 'ansible_shell_type' from source: unknown 24971 1727096444.69949: variable 'ansible_shell_executable' from source: unknown 24971 1727096444.69951: variable 'ansible_host' from source: host vars for 'managed_node3' 24971 1727096444.69958: variable 'ansible_pipelining' from source: unknown 24971 1727096444.69959: variable 'ansible_timeout' from source: unknown 24971 1727096444.69962: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24971 1727096444.70080: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24971 1727096444.70094: variable 'omit' from source: magic vars 24971 1727096444.70100: starting attempt loop 24971 1727096444.70105: running the handler 24971 1727096444.70116: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24971 1727096444.70158: _low_level_execute_command(): starting 24971 1727096444.70160: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24971 1727096444.70983: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096444.71020: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096444.71045: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096444.71083: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096444.71147: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096444.72804: stdout chunk (state=3): >>>/root <<< 24971 1727096444.72963: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096444.72967: stdout chunk (state=3): >>><<< 24971 1727096444.72972: stderr chunk (state=3): >>><<< 24971 1727096444.73091: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096444.73095: _low_level_execute_command(): starting 24971 1727096444.73098: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096444.7299747-26420-135925461529336 `" && echo ansible-tmp-1727096444.7299747-26420-135925461529336="` echo /root/.ansible/tmp/ansible-tmp-1727096444.7299747-26420-135925461529336 `" ) && sleep 0' 24971 1727096444.73722: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096444.73757: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096444.73781: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096444.73802: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096444.73868: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096444.75800: stdout chunk (state=3): >>>ansible-tmp-1727096444.7299747-26420-135925461529336=/root/.ansible/tmp/ansible-tmp-1727096444.7299747-26420-135925461529336 <<< 24971 1727096444.75958: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096444.75962: stdout chunk (state=3): >>><<< 24971 1727096444.75965: stderr chunk (state=3): >>><<< 24971 1727096444.75987: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096444.7299747-26420-135925461529336=/root/.ansible/tmp/ansible-tmp-1727096444.7299747-26420-135925461529336 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096444.76024: variable 'ansible_module_compression' from source: unknown 24971 1727096444.76173: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24971ctsu8jan/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 24971 1727096444.76177: variable 'ansible_facts' from source: unknown 24971 1727096444.76224: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096444.7299747-26420-135925461529336/AnsiballZ_command.py 24971 1727096444.76424: Sending initial data 24971 1727096444.76427: Sent initial data (156 bytes) 24971 1727096444.77296: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24971 1727096444.77343: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096444.77358: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096444.77388: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24971 1727096444.77407: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 24971 1727096444.77499: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096444.77530: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096444.77546: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096444.77574: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096444.77698: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096444.79304: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24971 1727096444.79355: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 24971 1727096444.79410: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24971ctsu8jan/tmp24_9jm93 /root/.ansible/tmp/ansible-tmp-1727096444.7299747-26420-135925461529336/AnsiballZ_command.py <<< 24971 1727096444.79435: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096444.7299747-26420-135925461529336/AnsiballZ_command.py" debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24971ctsu8jan/tmp24_9jm93" to remote "/root/.ansible/tmp/ansible-tmp-1727096444.7299747-26420-135925461529336/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096444.7299747-26420-135925461529336/AnsiballZ_command.py" <<< 24971 1727096444.80150: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096444.80177: stdout chunk (state=3): >>><<< 24971 1727096444.80204: stderr chunk (state=3): >>><<< 24971 1727096444.80227: done transferring module to remote 24971 1727096444.80314: _low_level_execute_command(): starting 24971 1727096444.80318: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096444.7299747-26420-135925461529336/ /root/.ansible/tmp/ansible-tmp-1727096444.7299747-26420-135925461529336/AnsiballZ_command.py && sleep 0' 24971 1727096444.80888: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24971 1727096444.80903: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096444.80982: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096444.81023: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096444.81040: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24971 1727096444.81064: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096444.81138: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096444.83050: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096444.83053: stdout chunk (state=3): >>><<< 24971 1727096444.83056: stderr chunk (state=3): >>><<< 24971 1727096444.83157: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096444.83160: _low_level_execute_command(): starting 24971 1727096444.83164: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096444.7299747-26420-135925461529336/AnsiballZ_command.py && sleep 0' 24971 1727096444.84282: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24971 1727096444.84286: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24971 1727096444.84288: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24971 1727096444.84440: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096444.84511: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096444.84590: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096444.84705: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24971 1727096444.84796: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096445.20722: stdout chunk (state=3): >>> {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 
wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 3220 0 --:--:-- --:--:-- --:--:-- 3244\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 3491 0 --:--:-- --:--:-- --:--:-- 3506", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-23 09:00:44.998976", "end": "2024-09-23 09:00:45.202811", "delta": "0:00:00.203835", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 24971 1727096445.22650: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
<<< 24971 1727096445.22655: stdout chunk (state=3): >>><<< 24971 1727096445.22663: stderr chunk (state=3): >>><<< 24971 1727096445.22666: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 3220 0 --:--:-- --:--:-- --:--:-- 3244\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 3491 0 --:--:-- --:--:-- --:--:-- 3506", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-23 09:00:44.998976", "end": "2024-09-23 09:00:45.202811", "delta": "0:00:00.203835", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! 
curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 24971 1727096445.22684: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts "$host"; then\n echo FAILED to lookup host "$host"\n exit 1\n fi\n if ! 
curl -o /dev/null https://"$host"; then\n echo FAILED to contact host "$host"\n exit 1\n fi\ndone\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096444.7299747-26420-135925461529336/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24971 1727096445.22687: _low_level_execute_command(): starting 24971 1727096445.22689: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096444.7299747-26420-135925461529336/ > /dev/null 2>&1 && sleep 0' 24971 1727096445.24186: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24971 1727096445.24311: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 24971 1727096445.24685: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 24971 1727096445.26497: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24971 1727096445.26541: stderr chunk (state=3): >>><<< 24971 1727096445.26544: stdout chunk (state=3): >>><<< 24971 1727096445.26618: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24971 1727096445.26621: handler run complete 24971 1727096445.26624: Evaluated conditional (False): False 24971 1727096445.26626: attempt loop complete, returning result 24971 1727096445.26628: _execute() done 24971 1727096445.26630: dumping result to json 24971 1727096445.26636: done dumping result, returning 24971 1727096445.26651: done running TaskExecutor() for managed_node3/TASK: Verify DNS and network connectivity [0afff68d-5257-3482-6844-00000000075f] 24971 1727096445.26666: sending task result for task 0afff68d-5257-3482-6844-00000000075f
ok: [managed_node3] => {
    "changed": false,
    "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n",
    "delta": "0:00:00.203835",
    "end": "2024-09-23 09:00:45.202811",
    "rc": 0,
    "start": "2024-09-23 09:00:44.998976"
}

STDOUT:

CHECK DNS AND CONNECTIVITY
2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org
2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org
2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org
2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org
2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org

STDERR:

% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0
100 305 100 305 0 0 3220 0 --:--:-- --:--:-- --:--:-- 3244
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0
100 291 100 291 0 0 3491 0 --:--:-- --:--:-- --:--:-- 3506

24971 1727096445.26983: no more pending results, returning what we have 24971 1727096445.26987: results queue empty 24971 1727096445.26988: checking for any_errors_fatal 24971 1727096445.27001: done checking for any_errors_fatal 24971 1727096445.27002: checking for max_fail_percentage 24971 1727096445.27004: done checking for max_fail_percentage 24971 1727096445.27005: checking to see if all hosts have failed and the running result is not ok 24971 1727096445.27006: done checking to see if all hosts have failed 24971 1727096445.27006: getting the remaining hosts for this loop 24971 1727096445.27008: done getting the
remaining hosts for this loop 24971 1727096445.27012: getting the next task for host managed_node3 24971 1727096445.27020: done getting next task for host managed_node3 24971 1727096445.27022: ^ task is: TASK: meta (flush_handlers) 24971 1727096445.27024: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24971 1727096445.27035: getting variables 24971 1727096445.27037: in VariableManager get_vars() 24971 1727096445.27197: Calling all_inventory to load vars for managed_node3 24971 1727096445.27200: Calling groups_inventory to load vars for managed_node3 24971 1727096445.27203: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096445.27381: Calling all_plugins_play to load vars for managed_node3 24971 1727096445.27386: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096445.27389: Calling groups_plugins_play to load vars for managed_node3 24971 1727096445.28242: done sending task result for task 0afff68d-5257-3482-6844-00000000075f 24971 1727096445.28247: WORKER PROCESS EXITING 24971 1727096445.29586: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096445.32506: done with get_vars() 24971 1727096445.32589: done getting variables 24971 1727096445.32792: in VariableManager get_vars() 24971 1727096445.32809: Calling all_inventory to load vars for managed_node3 24971 1727096445.32812: Calling groups_inventory to load vars for managed_node3 24971 1727096445.32814: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096445.32819: Calling all_plugins_play to load vars for managed_node3 24971 1727096445.32822: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096445.32825: Calling groups_plugins_play to load vars for managed_node3 24971 1727096445.35086: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096445.37372: done with get_vars() 24971 1727096445.37409: done queuing things up, now waiting for results queue to drain 24971 1727096445.37416: results queue empty 24971 1727096445.37417: checking for any_errors_fatal 24971 1727096445.37422: done checking for any_errors_fatal 24971 1727096445.37423: checking for max_fail_percentage 24971 1727096445.37424: done checking for max_fail_percentage 24971 1727096445.37429: checking to see if all hosts have failed and the running result is not ok 24971 1727096445.37430: done checking to see if all hosts have failed 24971 1727096445.37431: getting the remaining hosts for this loop 24971 1727096445.37432: done getting the remaining hosts for this loop 24971 1727096445.37435: getting the next task for host managed_node3 24971 1727096445.37439: done getting next task for host managed_node3 24971 1727096445.37441: ^ task is: TASK: meta (flush_handlers) 24971 1727096445.37442: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24971 1727096445.37446: getting variables 24971 1727096445.37448: in VariableManager get_vars() 24971 1727096445.37464: Calling all_inventory to load vars for managed_node3 24971 1727096445.37466: Calling groups_inventory to load vars for managed_node3 24971 1727096445.37470: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096445.37477: Calling all_plugins_play to load vars for managed_node3 24971 1727096445.37479: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096445.37482: Calling groups_plugins_play to load vars for managed_node3 24971 1727096445.39958: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096445.42651: done with get_vars() 24971 1727096445.42675: done getting variables 24971 1727096445.42732: in VariableManager get_vars() 24971 1727096445.42751: Calling all_inventory to load vars for managed_node3 24971 1727096445.42754: Calling groups_inventory to load vars for managed_node3 24971 1727096445.42756: Calling all_plugins_inventory to load vars for managed_node3 24971 1727096445.42761: Calling all_plugins_play to load vars for managed_node3 24971 1727096445.42764: Calling groups_plugins_inventory to load vars for managed_node3 24971 1727096445.42766: Calling groups_plugins_play to load vars for managed_node3 24971 1727096445.45583: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24971 1727096445.48255: done with get_vars() 24971 1727096445.48296: done queuing things up, now waiting for results queue to drain 24971 1727096445.48299: results queue empty 24971 1727096445.48299: checking for any_errors_fatal 24971 1727096445.48301: done checking for any_errors_fatal 24971 1727096445.48302: checking for max_fail_percentage 24971 1727096445.48303: done checking for max_fail_percentage 24971 1727096445.48303: checking to see if all hosts have failed and the running result is not ok 24971 1727096445.48304: done checking to see if all hosts have failed 24971 1727096445.48305: getting the remaining hosts for this loop 24971 1727096445.48306: done getting the remaining hosts for this loop 24971 1727096445.48309: getting the next task for host managed_node3 24971 1727096445.48312: done getting next task for host managed_node3 24971 1727096445.48313: ^ task is: None 24971 1727096445.48315: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24971 1727096445.48316: done queuing things up, now waiting for results queue to drain 24971 1727096445.48317: results queue empty 24971 1727096445.48317: checking for any_errors_fatal 24971 1727096445.48318: done checking for any_errors_fatal 24971 1727096445.48319: checking for max_fail_percentage 24971 1727096445.48320: done checking for max_fail_percentage 24971 1727096445.48320: checking to see if all hosts have failed and the running result is not ok 24971 1727096445.48321: done checking to see if all hosts have failed 24971 1727096445.48323: getting the next task for host managed_node3 24971 1727096445.48325: done getting next task for host managed_node3 24971 1727096445.48326: ^ task is: None 24971 1727096445.48327: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False

PLAY RECAP *********************************************************************
managed_node3 : ok=76 changed=2 unreachable=0 failed=0 skipped=62 rescued=0 ignored=0

Monday 23 September 2024 09:00:45 -0400 (0:00:00.828) 0:00:32.961 ******
===============================================================================
fedora.linux_system_roles.network : Configure networking connection profiles --- 2.89s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Check which services are running ---- 1.93s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.85s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Gathering Facts --------------------------------------------------------- 1.57s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tests_ipv6_nm.yml:6
Create veth interface veth0 --------------------------------------------- 1.25s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27
Gathering Facts --------------------------------------------------------- 1.10s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:3
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.95s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Check which packages are installed --- 0.92s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Check which packages are installed --- 0.89s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Check if system is ostree ----------------------------------------------- 0.86s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17
Install iproute --------------------------------------------------------- 0.84s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16
Verify DNS and network connectivity ------------------------------------- 0.83s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24
Install iproute --------------------------------------------------------- 0.83s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.82s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Gather the minimum subset of ansible_facts required by the network role test --- 0.80s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.73s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Ensure ping6 command is present ----------------------------------------- 0.70s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:81
fedora.linux_system_roles.network : Re-test connectivity ---------------- 0.60s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
Gather current interface info ------------------------------------------- 0.55s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3
Get ipv6 routes --------------------------------------------------------- 0.50s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:69
24971 1727096445.48455: RUNNING CLEANUP
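
For reference, the "Verify DNS and network connectivity" task timed above (check_network_dns.yml:24) runs the shell script that appears verbatim in the module_args (_raw_params) earlier in this log. A minimal sketch of what such a task could look like is shown below; the shell body is copied from the log, while the YAML wrapper is an assumption: ansible.builtin.shell is inferred from _uses_shell: true, and changed_when: false is inferred from the displayed result reporting "changed": false even though the command executed.

    - name: Verify DNS and network connectivity
      # Sketch only: the shell body is verbatim from _raw_params in the log
      # above; the module choice and changed_when are inferred, not taken from
      # the original playbook source.
      ansible.builtin.shell: |
        set -euo pipefail
        echo CHECK DNS AND CONNECTIVITY
        for host in mirrors.fedoraproject.org mirrors.centos.org; do
          if ! getent hosts "$host"; then
            echo FAILED to lookup host "$host"
            exit 1
          fi
          if ! curl -o /dev/null https://"$host"; then
            echo FAILED to contact host "$host"
            exit 1
          fi
        done
      changed_when: false

With set -euo pipefail and the explicit exit 1 branches, any failed getent lookup or curl request fails the task (and the play), so the rc=0 result above confirms both mirror hosts resolved and were reachable over HTTPS.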